Casey Bralla on 1 Jan 2010 11:42:13 -0800
Believe it or not, I've got __another__ system with instability problems. I bought a new motherboard, RAM, and CPU for Christmas for a light-duty server. It was giving me some spontaneous reboots, so I've been running the memtest program. It's gotten a lot worse in the last few hours, but the results are a little strange to me. I was hoping someone here could help me understand. (No, Eric, this one runs Debian, so I haven't worn it out by compiling everything <grin>)

There are 2 DIMM slots, and I have a matched pair of 2-GByte DDR2-800 DIMMs. If I put either of the DIMMs in either slot, it runs fine. However, if I put BOTH DIMMs in, the system reboots about 10 seconds into the memtest routine. I put a pair of 1-GByte DDR2-1066 DIMMs from another computer into this server (both slots) and it ran fine. I've also tried raising the voltage and slowing the memory down to DDR2-400 speeds, but nothing has helped.

I'm concluding that the RAM is bad and will RMA it to Newegg. But I'm feeling a little uncomfortable about this. Might it be the motherboard that's bad? How can RAM be good when run as a single DIMM, but fail in pairs? BTW, when I run them in pairs, they are in interleaved (dual-channel) mode, but obviously are not interleaved when run as a single DIMM.

--
Casey Bralla
Chief Nerd in Residence
The NerdWorld Organisation
http://www.NerdWorld.org
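For what it's worth, here's a rough sketch of what I mean by interleaved (dual-channel) mode and why it only stresses both modules together. The 64-byte granularity and the function name are just assumptions for illustration, not the actual chipset mapping:

    # Conceptual sketch only: how dual-channel (interleaved) mode spreads
    # physical addresses across two DIMMs, assuming a simple 64-byte
    # (cache-line) interleave granularity. Names are made up for illustration.

    INTERLEAVE_GRANULARITY = 64  # bytes per chunk sent to one channel

    def channel_for_address(addr: int) -> int:
        """Return which DIMM/channel (0 or 1) would service this physical address."""
        return (addr // INTERLEAVE_GRANULARITY) % 2

    if __name__ == "__main__":
        # In single-DIMM mode every address goes to the one installed module.
        # In interleaved mode, consecutive 64-byte chunks alternate between
        # the two DIMMs and both channels are driven simultaneously.
        for addr in range(0, 512, 64):
            print(f"address 0x{addr:04x} -> channel {channel_for_address(addr)}")

The point being that only in paired mode are both modules (and both board channels) being hammered at once, so a marginal DIMM or a marginal motherboard trace could pass every single-module test and still fall over in dual-channel mode.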