MODFEST may be cancelled this summer... why, you ask?
Well, it's that time... it's been five years since the last computer lab was built at the school I work for. As the System Admin, it's my job to design and build a new one. Every five years our non-profit school gets a nice grant to build a new lab.
Now, since it's not guaranteed that we get a grant, the design life needs to be 7 years. So this time, the formula is the same... computers about 2x-3x as powerful as they need to be at present, Gigabit everywhere, and a good stash of spare parts (enough to totally rebuild one machine).
It's gonna be pretty crazy for a school with 60 students... 2500 feet of Cat6 cable, a nice 48-port Cisco switch, 13 new PCs, a new server, HDTV in every classroom (giant monitor for TeacherTube, Bill Nye and Wikipedia!), tons of new software, and the elimination of ALL our old, nasty, unreliable networking equipment that I've gotten tired of bringing back from the brink.
Three servers are going to be virtualized into one, saving over 400W of power and allowing all servers to have battery backup.
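That 400W figure translates into real money on the power bill. A quick back-of-the-envelope check (the $0.12/kWh electricity rate here is an assumed average, not our actual utility price):

```python
# Rough annual savings from consolidating three servers into one.
# The 400 W figure is from the plan above; the $0.12/kWh rate is
# an assumed average rate, not our actual bill.
watts_saved = 400
hours_per_year = 24 * 365              # servers run around the clock
kwh_per_year = watts_saved * hours_per_year / 1000
cost_per_kwh = 0.12                    # assumed rate in $/kWh
dollars_per_year = kwh_per_year * cost_per_kwh
print(f"{kwh_per_year:.0f} kWh/yr, about ${dollars_per_year:.0f}/yr")
# → 3504 kWh/yr, about $420/yr
```

A few hundred dollars a year is not nothing for a non-profit, and that's before counting the smaller UPS it lets us buy.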
Designing a computer that can last 5-7 years is a challenge. We did OK last time... the machines aren't bad even today, but they are starting to have a few hardware failures due to age and the less-than-ideal A/C in the computer room. (It's HOT in there some days.)
So what did we put in the lab back in 2006? First, the workstations:
Athlon 64 3500+ (Socket 939, Venice core)
Corsair Value Select 1GB DDR 3200
250GB WD Caviar SE16 SATA HDD
GeForce 7600 GS 256MB (fanless!)
Gigabyte nForce4 SLI motherboard (thanks to a sale, the SLI version was $15 less... go figure)
NEC 16x DVD-RW drive
Floppy drive (once we introduced flash sticks, these became "dust drives" :p )
350W Power supply
17" 1280x1024 LCD with DVI interface. (Our less visually able students and teachers love the sharp text)
Windows XP Pro x86
Mid tower ATX case
And the 2006 server:
2x AMD Opteron 248 CPUs
Crucial 4GB DDR 3200, registered, ECC
2x WD Raptor 150GB in RAID1, replaced under warranty with 150GB VelociRaptors after both drives failed within one week
GeForce 7600 GS 256MB (fanless) (running two displays on the server has been neat)
Asus nForce Professional 2200 motherboard
650W Power supply
16x DVD reader, 16x DVD-RW
Floppy drive (ESSENTIAL for loading the RAID driver during OS install)
Multi card reader
Windows Server 2003 R2 x86 (needed legacy program support... another thing we do away with in 2011!)
Full tower case
Now the new lab is up in the air... but it's gonna be something along these lines:
AMD "Bulldozer" or Intel Core i5 2xxx "Sandy Bridge" CPU (depends on whether AMD delivers the CPU in time for our build)
8GB DDR3 RAM
1TB WD Caviar Black HDD (64MB cache, 5-yr warranty, reasonably priced... it's my pet drive)
Some sort of lower-midrange GPU... GeForce GT 440 or the equivalent from AMD/ATI
520W 80 Plus Gold certified PSU (expensive as hell, but it saves about half a space heater's worth of heat and will pay for itself in 4 years)
BD-ROM/DVD-RW combo drive (right now our apps are on DVD... in less than 5 years, they will probably be on BD-ROM)
20" widescreen LCD, HDMI interface. (looking at Asus and Acer)
Microsoft keyboard, Logitech mouse (we went through IntelliMouse Explorers left and right with kids clicking like they were the Hulk; they haven't found a way to kill an MX-510/518 yet)
Same Sony MDR-150 headsets... unless they have steel-armored cables, any headphones will eventually succumb to the munchkins!
Mid tower ATX case
Multi card reader
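Whether the Gold PSU really pays for itself in 4 years depends entirely on load, duty cycle and the price premium, so here's a rough parameterized sketch. Every number in it (efficiencies, average loads, hours, price premium, electricity rate) is an assumption for illustration, not a measurement:

```python
# Parameterized payback estimate for a high-efficiency PSU.
# All figures below are assumed for illustration: 90% vs 78%
# typical-load efficiency, a $40 price premium, and $0.12/kWh.
def payback_years(load_w, hours_per_year,
                  eff_new=0.90, eff_old=0.78,
                  premium=40.0, rate=0.12):
    """Years for the efficiency savings to cover the price premium."""
    wall_old = load_w / eff_old        # watts drawn at the wall
    wall_new = load_w / eff_new
    saved_kwh = (wall_old - wall_new) * hours_per_year / 1000
    return premium / (saved_kwh * rate)

# Lab PC: ~150 W average draw, school hours only (~8 h x 180 days, assumed)
print(f"workstation: ~{payback_years(150, 8 * 180):.1f} yr")
# Server: ~300 W average draw, running 24/7 (assumed)
print(f"server: ~{payback_years(300, 24 * 365):.1f} yr")
```

Under light school-day duty the payback stretches out considerably; the 24/7 server is where high efficiency earns its keep fastest.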
And the new server:
Same case, the good old Lian-Li PC70 (it looks good, runs cool, fits BIG motherboards, and it's not obsolete yet!)
2x AMD Opteron, 8-core (cheap, good virtualization performance)
8GB DDR3 ECC registered RAM
3x HDDs in RAID 5 with a hardware controller for the OS; investigating SAS vs. SATA from a price/performance perspective
2x WD Caviar Black 2TB in RAID1 for user file storage
850W 80 Plus Gold certified PSU
BD-ROM/DVD-RW combo drive
Fedora 14 Linux or LFS with an SELinux kernel, plus Windows Server 2008 R2 running in KVM
Basic GPU, probably the same as the workstations for simplicity's sake (hey, I like my 1080p framebuffer console!)
Network infrastructure as it stands today:
Linksys WRT54GS router with HyperWRT firmware
24-port 3Com unmanaged Gigabit switch
Numerous fanout switches of questionable pedigree (no Airlink crap though!)
Cat5e cabling in the ceiling, some Cat5 (Cat5e can do GbE over short lengths)
Custom firewall, content filter, and caching system built from Linux on a used Rackable Systems server, currently offline due to an unknown failure
In other words, our network sucks, but when you're a non-profit, you build what you can... we quickly expanded beyond our original network, and it's now a Frankenstein monster. This time we have set aside some extra cash for some REAL networking hardware.
New Network as proposed:
Cisco router (this will serve firewall duties as well)
Cisco 48-port Gigabit switch, no PoE (we don't need it)
Content filter/cache becomes a VM on the server, old Rackable box retired
Cat6 cabling and patch panels from end to end... GbE EVERYWHERE
Two 802.11g access points
No fanout switches in the classrooms that kids can step on, kick, or otherwise bring offline