IF YOU ARE one of the people craving large amounts of memory in your server but wincing at the DIMM counts for Sandy Bridge and Bulldozer servers, there is hope. That hope comes in the form of an LR-DIMM.
LR-DIMMs are a JEDEC-standardized version of the late, lamented MetaRAM technology, basically faking a big DRAM chip with a bunch of smaller ones. LR stands for Load Reduced, and that is exactly what the buffer on the DIMM does.
The net effect is that you can build a 4GB DIMM from 32 1Gb chips while the memory controller sees only 8 4Gb chips. This makes life much easier for the memory controller, and effectively quadruples DIMM capacities.
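The arithmetic behind that claim is easy to sketch. The snippet below just works through the article's own example numbers (32 chips of 1Gb presented as 8 virtual chips of 4Gb); the variable names are illustrative, not any vendor's terminology.

```python
# LR-DIMM capacity math, using the example figures from the text above.

physical_chip_gbit = 1        # density of each real DRAM chip, in gigabits
physical_chip_count = 32      # real chips on the DIMM

total_gbit = physical_chip_gbit * physical_chip_count   # 32 Gb on the stick
dimm_capacity_gbyte = total_gbit // 8                   # 8 bits per byte -> 4 GB

# The LR buffer presents the same total capacity as fewer, denser
# "virtual" chips to the memory controller:
virtual_chip_gbit = 4
virtual_chip_count = total_gbit // virtual_chip_gbit    # 8 virtual chips

# The load-reduction factor is how many real chips hide behind each
# virtual one -- the "quadrupling" the article mentions:
load_reduction = physical_chip_count // virtual_chip_count  # 4x

print(dimm_capacity_gbyte, virtual_chip_count, load_reduction)  # 4 8 4
```

Run it and you get a 4GB DIMM that looks like 8 chips to the controller, with each apparent chip standing in for four real ones.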
There is a downside: the LR-DIMM needs a buffer, and that chip tends to be quite expensive. Luckily, very high density DRAMs tend to be even more expensive, so the buffer not only makes the impossible possible, it makes it cheaper than most other routes. If they were possible. Or something like that. My head hurts.
One twist is that the LR-DIMMs seem to need explicit support, or at least awareness on the system side that they are playing tricks. MetaRAM didn’t seem to need this, so it may be the case that the LR-DIMM buffer is not quite as comprehensive a solution. Then again, ‘not quite as comprehensive’ usually means ‘vastly cheaper’, so it may be a very good tradeoff.
In the end, the next generation of server CPUs should support LR-DIMM technology out of the gate, so it is a moot point. Inphi demonstrated the technology up and running during IDF a few months ago, so things are close. LR-DIMMs will make 32GB DIMMs something between possible and common, with higher density sticks quite doable should there be a wallet large enough to support them.S|A
Latest posts by Charlie Demerjian (see all)