Estimating 3D XPoint Die Size

By now most of you probably know that I'm a sucker for die sizes, and since this is information that the DRAM and NAND vendors are unwilling to share, I've gone as far as developing my own method for estimating the die size (well, it's really just primary school geometry, so I can't take too much credit for it). Die size is the key factor in determining cost efficiency because it directly determines the number of gigabytes each wafer yields, which makes it a vital metric for comparing different technologies and process nodes.
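To make the cost link concrete, here is a quick sketch of how die size translates into gross gigabytes per wafer, using the standard gross-die-per-wafer approximation (the edge-loss term is an approximation, and yield and scribe lines are ignored here):

```python
import math

WAFER_DIAMETER_MM = 300.0

def gross_dies_per_wafer(die_area_mm2):
    """Classic gross-die approximation: wafer area divided by die area,
    minus an edge-loss term proportional to the wafer circumference."""
    d = WAFER_DIAMETER_MM
    wafer_area = math.pi * (d / 2.0) ** 2
    edge_loss = math.pi * d / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def gigabytes_per_wafer(die_area_mm2, die_capacity_gbit):
    """Gross (pre-yield) capacity of one wafer in gigabytes."""
    return gross_dies_per_wafer(die_area_mm2) * die_capacity_gbit / 8.0

# A 128Gbit die at 210mm^2 vs 220mm^2 on a 300mm wafer:
print(gigabytes_per_wafer(210.0, 128))  # ~290 gross dies -> 4640.0 GB
print(gigabytes_per_wafer(220.0, 128))  # ~276 gross dies -> 4416.0 GB
```

Even a 10mm^2 difference in die size moves the per-wafer capacity by a couple of hundred gigabytes, which is why these estimates matter for cost comparisons.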

I'm borrowing the above picture from The SSD Review because, to be honest, my wafer photos (and photos in general) are quite horrible, and wafers are far from being the easiest objects to photograph given all the reflections. Sean is a professional photographer, so he managed to grab this clear and beautiful photo of the production 3D XPoint wafer Intel and Micron had on display, making it easy to estimate the die size.

I counted 18 dies horizontally and 22 vertically, which puts the die pitch at about 16.7mm by 13.6mm on a standard 300mm wafer and yields roughly 227mm^2 per die. When taking die cuts (i.e. the scribe lines between dies) into account, we should be looking at 210-220mm^2. Array efficiency is about 90%, which is much higher than in planar NAND because most of the peripheral circuitry lies underneath the memory array.
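The geometry behind that number is simple enough to sketch. Note that the scribe-line widths below are my own assumptions, not anything Intel or Micron have disclosed:

```python
WAFER_MM = 300.0
dies_across, dies_down = 18, 22

# Die pitch: the wafer diameter divided by the number of dies that span it
die_pitch_w = WAFER_MM / dies_across   # ~16.7 mm
die_pitch_h = WAFER_MM / dies_down     # ~13.6 mm
pitch_area = die_pitch_w * die_pitch_h
print(round(pitch_area))               # 227 mm^2

# Subtracting an assumed scribe-line (die cut) width brings the net die
# into the 210-220 mm^2 range; these widths are hypothetical examples.
for scribe_mm in (0.25, 0.40):
    net = (die_pitch_w - scribe_mm) * (die_pitch_h - scribe_mm)
    print(scribe_mm, round(net, 1))    # ~219.8 and ~215.3 mm^2
```

The pitch already includes the scribe lines, which is why the raw 227mm^2 figure overstates the actual die area.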

IMFT 20nm 128Gbit MLC NAND die

For comparison, Intel-Micron's 20nm 128Gbit MLC NAND die measures 202mm^2 and has an array efficiency of ~75%. From that we can calculate that the 128Gbit memory array in 3D XPoint takes about 190mm^2, while a similar-capacity planar NAND array measures ~150mm^2 (since the 128Gbit 3D XPoint die consists of two layers and the 128Gbit MLC NAND die stores two bits per cell, the number of layers and the bits stored per cell cancel out). At a fixed feature size (i.e. lithography), NAND thus appears to be about 20-25% denser from a memory array perspective, but at this point it's hard to say whether that is down to the cell design itself or something else. Connecting the layers of wordlines and bitlines to the intermetal layers likely takes some extra area compared to a 2D process (at least this is the case with 3D NAND), which might partially explain the density deficit.
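The arithmetic behind those array-area figures is straightforward; here it is spelled out, using the low end of the 3D XPoint die-size estimate above (which is itself an estimate, so treat the gap as rough):

```python
# Back out the memory-array area from die size and array efficiency.
xpoint_die_mm2, xpoint_eff = 210.0, 0.90   # low end of the 210-220 estimate
nand_die_mm2, nand_eff = 202.0, 0.75       # IMFT 20nm 128Gbit MLC NAND

xpoint_array = xpoint_die_mm2 * xpoint_eff   # ~189 mm^2 for 128Gbit (2 layers)
nand_array = nand_die_mm2 * nand_eff         # ~152 mm^2 for 128Gbit (2 bits/cell)

# Two layers vs two bits per cell cancel out, so the two arrays compare
# directly at the same 128Gbit capacity.
density_gap = xpoint_array / nand_array - 1.0
print(f"3D XPoint array is ~{density_gap:.0%} larger per bit")  # ~25%
```

Using the high end of the die-size estimate (220mm^2) pushes the gap a few points higher, which is why the 20-25% figure should be read as a rough range.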

However, we will have to wait for some SEM photos to really see what's happening inside the 3D XPoint array and how it compares to NAND in cell size and overall density. Of course, there is a lot more to total manufacturing cost than just the cell and die size, but I'll leave the full analysis to those with the proper equipment and deeper knowledge of semiconductor manufacturing processes.

What Happens to 3D NAND

The above analysis already gives a hint that 3D XPoint isn't about to replace 3D NAND, at least not in the foreseeable future. That's also what Intel and Micron clearly stated when asked about 3D XPoint's impact on 3D NAND: it's really a new class of memory that fills a niche DRAM and NAND cannot. The two companies are still looking forward to rolling out 3D NAND next year and have a strong roadmap of future 3D NAND generations.

As I mentioned earlier, the way a 3D XPoint array is built is quite different from 3D NAND, and my understanding is that it's less economical. That's one of the reasons why the first-generation product is a two-layer design at 20nm, rather than dozens of layers at a larger lithography with single patterning as in 3D NAND. Unless there's a way to build 3D XPoint arrays more like 3D NAND (i.e. pattern and etch multiple layers at the same time), I don't see 3D XPoint becoming cost competitive with 3D NAND anytime soon, but then again it's not aimed at being a NAND successor in the short term.

What happens in ten years' time is a different question, though. 3D NAND does have some inherent scaling obstacles, with vanishing string current likely being the biggest and best-known at this point. Basically, the channel in each 3D NAND "cell tower" (i.e. a stack of layers, currently 32 for Samsung and Intel-Micron) is a single string that electrons have to flow through to reach every individual cell in the string. The problem is that as the string gets longer (i.e. as more layers are added), it becomes harder to reach the top cells because the cells along the way cause disturbance, reducing the overall string current (hence the name "vanishing string current"). For those interested in a more detailed explanation of this issue along with some experimental data, I suggest heading over to 3D InCites and reading Andrew Walker's post on the topic.
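As a grossly simplified illustration of the scaling pressure (this is a toy model, not real device physics: it treats every cell as a fixed series resistance with arbitrary values, and ignores disturb effects entirely):

```python
# Toy model: string current if each cell adds a fixed series resistance.
# V_BL and R_CELL are arbitrary illustrative values, not measured data.
V_BL = 1.0     # assumed bitline drive voltage (arbitrary units)
R_CELL = 1.0   # assumed per-cell channel resistance (arbitrary units)

def string_current(layers):
    """Current through a string of `layers` cells in series: falls as 1/N."""
    return V_BL / (layers * R_CELL)

for n in (32, 64, 128):
    print(n, string_current(n))
# Doubling the layer count halves the current in this model, which is the
# essence of the scaling problem: more layers mean less current to sense.
```

Real strings degrade through more complicated mechanisms than simple series resistance, but the qualitative trend (more cells per string, less current at the sense amplifier) is the same.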

Since most vendors haven't even started 3D NAND mass production, the technology isn't going to hit a wall anytime soon (Toshiba-SanDisk's 15nm NAND already has strings consisting of 128 cells, for example), but like any semiconductor technology, 3D NAND will reach a scaling limit at some point. Whether that happens in five, ten or twenty years is unknown, but having a mature and scalable technology ready at that point, which is what 3D XPoint should be by then, is important.


• dlop - Friday, July 31, 2015 - link

I'm still using only 5400 RPM HDDs. I'm waiting for a DRAM-less SSD with 2-bit MLC memory, power loss protection and a 5-year warranty.
• jamyryals - Friday, July 31, 2015 - link

Buy an enterprise SSD, they have those features.
• dlop - Saturday, August 1, 2015 - link

Enterprise SSDs are too expensive for a low-end home desktop PC. Removing the DRAM would make them cheaper because fewer capacitors would be needed. SSDs could probably also be cheaper if they used the solid capacitors that are used on motherboards; the bigger size of those capacitors isn't a problem in a desktop PC.
• Zan Lynx - Sunday, August 16, 2015 - link

You didn't specify cheap in your first comment. And you know that you can't have everything. If you want cheap then you give up something else. Like 5 year warranties and power loss protection.
• dlop - Monday, August 17, 2015 - link

Intel 535 is a DRAM-less SSD with a 5-year warranty but without power loss protection. A solid motherboard capacitor costs about $1. A DRAM-less PCIe or M.2 SSD with motherboard capacitors shouldn't be much more expensive than the Intel 535.

Intel 750?
• MrBowmore - Friday, July 31, 2015 - link

Not DRAM-free, but it has big capacitors that give the drive enough power to write down the cache. Most of the RAM is for the indexing table anyway. Maybe you just hate DRAM?
• JKflipflop98 - Monday, August 3, 2015 - link

Awfully picky for someone willing to put up with such a crappy drive for so long.
• toooskies - Friday, July 31, 2015 - link

Not one mention of the mobile market, when it's an ideal place to replace DRAM + NAND. The fact that it's non-volatile will cut idle power usage, and you save PCB space by including it all in a single chip. Obviously database servers will be huge, but the place we're likely to see this stuff on the consumer market is in a cell phone.
• Freakie - Friday, July 31, 2015 - link

Bottom of the Products & Applications page talks about mobile devices. It'd be great for all of the low-end to mid-range smartphones, but the high-end ones that do benefit from fast RAM are likely to keep using DRAM, while the more mainstream phones could potentially switch to XPoint.