With no DDR4, or any other new standard for that matter, on the horizon, DRAM improvements are being taken as power savings rather than big speed bumps. The market is interested in lowering power; speed isn't as much of a concern, especially in the server market. That said, Samsung had a bunch of goodies to show off.
32GB in a single DIMM, closer to ‘enough’
The three above are big, but the bottom one in particular is quite the engineering feat. If you recall, LRDIMMs like the center one have a buffer on board to allow for greater capacities. That is all fine and dandy, but doing a 32GB DIMM like the bottom one without the buffer is quite a trick. The top UDIMM is just a very small, low profile 8GB DIMM. Hard to do, but nothing spectacular.
8GB for laptops, now with ECC
A few booths away, Crucial was showing off a production 8GB SO-DIMM, and Samsung has one too. Actually, Samsung has two too, with one being ECC protected. It says it is for Atoms, but most craptops won't need ECC, so it is probably aimed at industrial designs and embedded applications.
The last things Samsung had to show off don't do well in pictures, they just look like any other memory chip. Then again, DRAM doesn't do well in pictures in general, it all kind of looks the same. Le sigh. In any case, Samsung was showing off a 4Gb DRAM chip, that is 512MB per part mind you, which made up the 32GB DIMMs above. To balance that out, they had 1.25V DDR modules and 4Gb LPDDR2 chips too.
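If you want to sanity check the 4Gb-to-32GB math, a quick sketch works it out. This is just illustrative arithmetic, not Samsung's actual chip layout; it ignores ranks and any extra ECC parts, which the article doesn't detail:

```python
# Capacity arithmetic for a DIMM built from 4Gb parts.
# Assumption (not from the article): capacity comes purely from
# data chips, no ECC chips or rank details counted.

BITS_PER_BYTE = 8

chip_gbits = 4                             # one 4Gb DRAM part
chip_gbytes = chip_gbits / BITS_PER_BYTE   # 0.5GB, i.e. 512MB per part

dimm_gbytes = 32                           # the 32GB DIMM shown off
chips_needed = dimm_gbytes / chip_gbytes   # how many 4Gb parts that takes

print(f"{chip_gbytes * 1024:.0f}MB per chip, {chips_needed:.0f} chips per DIMM")
```

So a 32GB DIMM needs 64 of those 4Gb parts for the data alone, which is exactly why pulling it off without an LRDIMM-style buffer is a trick.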
With all these parts in the catalog, you can make some really big and really low power DIMMs. Once a faster DRAM spec is approved, you can trade in some of that headroom for speed, and off you go. This may not be the stuff of headlines, but it is important, and more capacity is never a bad thing.S|A