Orsm.net Server Build...

I did this little write-up because I thought some of you guys may be interested in reading what goes into running the site. The next couple of pages show the sequence in which everything was done to take Itchy and Scratchy - Orsm.net's new servers - from bits and pieces in boxes and bags to two kick ass servers.

Late last year I started tinkering with the idea of adding a second server to run the site. Things were just starting to slow down and the bandwidth the server was plugged into was getting a touch congested as more and more of you guys came here. After much consulting with friends and people in the know, and very little research, I decided to take the plunge and build a couple of servers instead of using a rented machine. Great idea. Simple in theory - not quite that way in implementation.

The first thing we did was work out what was required. It was decided early on that it was time to go from a single server to two. We could have built one kick ass machine but I'd end up needing a second one eventually so there wasn't much point. Next we had to figure out what would need to run where - things like Apache, DNS, MySQL, PHP, mail and a whole heap of other things all had to be taken into consideration. There was no point setting them up to load balance each other - too much screwing around involved. This way, each machine handles particular tasks. Itchy primarily just serves pages and stores all the HTML, images and PHP that you guys see. Scratchy handles video downloads, mail and MySQL stuff.
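To give a rough idea of what that split looks like from the code side, here's a minimal sketch [the hostname, login details and table names are all made up for the example] of how a PHP page served off Itchy would talk to the MySQL server sitting over on Scratchy rather than localhost:

    <?php
    // Hypothetical example - hostname, credentials and table are invented.
    // The only real difference from a single-server setup is that the
    // connection points at Scratchy instead of localhost.
    $link = mysql_connect("scratchy.orsm.net", "orsm_user", "secret")
        or die("Couldn't reach the database server: " . mysql_error());
    mysql_select_db("orsm", $link);
    $result = mysql_query("SELECT title FROM updates ORDER BY posted DESC LIMIT 1", $link);
    $row = mysql_fetch_assoc($result);
    echo $row["title"];
    mysql_close($link);
    ?>

The nice part of splitting things this way is that a heavy MySQL query chewing up CPU on Scratchy doesn't steal anything from the Apache processes dishing out pages on Itchy.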

After that we compared the specs of the current and previous servers and our home computers, looked at how they'd handled growth and traffic increases, and considered what the future may hold - all in aid of getting a better idea of how much horsepower was required.

Whilst I didn't set a budget to work within, I tried as hard as possible to keep costs down. How to do that without skimping was the hard part. I had a few suggestions that would have been unreal, such as SCSI RAID, but when you consider that each server would have needed a SCSI/RAID card and three SCSI drives, the cost of the whole project would almost have doubled. The other thing we wanted to use was dual Xeon processors but once again the cost was prohibitive for what we were doing.

Anyways, after much deliberation and constant back and forth between Chris, Tim and myself, we decided the following would cut it [the below is for 2 machines]:

2 x Intel Pentium 4 2.4GHz, Socket 478, boxed with HS & fan; 512K Northwood; 533MHz FSB
1 x LG 52X CD-ROM drive
4 x Western Digital 40GB 7200RPM UATA100 hard drives; JB model; 8MB cache
2 x Intel D845PESV motherboards; Pentium 4 533MHz FSB; 333MHz DDR (PC2700); AGP 4X; ATX
2 x Panasonic 1.44MB floppy drives
2 x Intel Pro/100-S desktop adapters with i82550EY Fast Ethernet controller; 3DES 168-bit
4 x Direct PC 512MB PC2700 DDR RAM
2 x Sparkle PCI video cards, Nvidia TNT2 M64 chipset
2 x 2U SVEC server cases w/ 300W power supplies
2 x server rails
2 x Promise FastTrak100 ATA100 PCI RAID 0/1 2-channel controllers, OEM

So in a nutshell EACH machine is a P4 2.4GHz running on an Intel board with 1GB of RAM and a Promise RAID controller [RAID 0] across dual 40GB Western Digital hard drives, all sitting in an SVEC 2U case with a 300W power supply. Not bad huh!?

A couple of people suggested I should have gone with Athlon processors but from what I've seen and read I don't think it was a viable option when you consider the servers were going to be on the other side of the world from me. I've always used Pentiums and I've never had a problem with them - no need to change that now.

Getting the machines built wasn't too hard. Chris and I did it over a couple of days and didn't have any dramas with the hardware I'd bought. Everything fitted like it was supposed to and Itchy and Scratchy were born.

Then came time to get an operating system installed on them both. Piece of piss, I thought - I've installed Linux dozens of times. Note: here's where I started to get frustrated. Seems the RAID controllers I chose don't have good Linux driver support.

After much fucking around and probably 15 attempts to get Red Hat, Debian and Mandrake installed, Tim waltzes in and gets Slackware running in about an hour. Bastard. This of course was done after a couple of beers were consumed, so if anything fucks up really badly we know who to blame. Spent the next day transferring all the site files off my home computer onto the servers. They were more or less ready by this stage.

Next hurdle to overcome was shipping. Dropped them off on the Tuesday only to find out that they wouldn't be shipped until Friday [not impressed]. They finally exit Perth, get shipped to Auckland in New Zealand, then Los Angeles and on to Dallas in Texas. After a few hassles with paperwork and a mildly assertive email from myself, the boxes were passed off to US Customs, where they sat until Monday. By Wednesday Perth time the servers had arrived at the colocation facility and were plugged in. Success!

Itchy and Scratchy also see a big increase in available bandwidth. Up until now the site has run off a burstable 10Mbit pipe, meaning the fastest data can move through it is 10Mbit per second. The problem was that over the last few months the site hasn't had much trouble maxing out the link. We've now got 30Mbit between the two machines, which means everything should be nice and fast.
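For some back-of-envelope numbers on what that difference actually means [the file size, and the assumption of having the whole pipe to yourself, are invented for illustration - in reality the pipe is shared between everyone on the site at once], here's a quick PHP snippet:

    <?php
    // Rough illustration only - converts pipe capacity in megabits/sec
    // to megabytes/sec (8 bits per byte) and works out transfer times.
    $old_pipe_mbit = 10;
    $new_pipe_mbit = 30;
    $video_mb = 5;  // hypothetical video download size

    $old_mbytes_sec = $old_pipe_mbit / 8;   // 1.25 MB/s
    $new_mbytes_sec = $new_pipe_mbit / 8;   // 3.75 MB/s

    printf("Old pipe: %.2f MB/s, %dMB file in %.1f sec\n",
           $old_mbytes_sec, $video_mb, $video_mb / $old_mbytes_sec);
    printf("New pipe: %.2f MB/s, %dMB file in %.1f sec\n",
           $new_mbytes_sec, $video_mb, $video_mb / $new_mbytes_sec);
    ?>

So tripling the pipe takes that hypothetical 5MB download from around 4 seconds to around 1.3 - and more importantly there's three times the headroom before the link maxes out again.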

Anyways, that about wraps up the story of how everything came together. The next few weeks of you guys surfing the site will tell the tale of whether it was all worth it, I guess. It's been one of the most stressful months I've ever had as well. There was so much shit I had to learn and work out to make this go smoothly that I'm amazed you guys are actually reading this page!

One last thing - HUGE thanks to Chris, Tim and Chris M. Without their help this would never have happened.

Anyways, the next couple of pages are just progress photos. Check em out...

Picture of the stuff required to build the servers [except for the video cards, which we couldn't get until the next day, and the RAID controllers]
Same again...
SVEC server cases... or the boxes they came in, at least. Very nice cases and they're black too!
Group photo of hardware. Was actually quite exciting to see everything all together like this. Chris and I managed to do four hardware retailers in two hours on a mad dash Friday afternoon. Almost managed to crash the car in the process too!
One of the processors. Both of them are Pentium 4 2.4GHz. I was tempted to opt for dual Xeons at first but cost got in the way. Next idea was to go for the P4 3.06GHz because it does Hyper-Threading like the Xeons do, but they are ridiculously expensive and well and truly overkill for my needs. I'm sure the 2.4s will do the job nicely.

Four Western Digital 40GB hard drives - 2 per machine running RAID 0 [striping]. In other words, these things are configured for speed only. The beauty of this setup is also its biggest drawback. RAID 0 splits data across the two drives, so when a file is requested it is read from both drives at once - thus roughly twice as fast. The downside is that if one of the drives fails you lose all your data, because essentially the surviving drive only has half of the data for every file. Make sense?
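If that's a bit abstract, here's a toy PHP snippet [purely illustrative - the Promise card does all of this in hardware at the block level] showing the striping idea: chop the data into stripes, deal them out between the two drives, and notice that either drive on its own only ever holds half of any file:

    <?php
    // Toy RAID 0 illustration - chop data into fixed-size stripes and
    // deal them out to the two drives like a deck of cards.
    function stripe($data, $stripe_size) {
        $drives = array("", "");
        foreach (str_split($data, $stripe_size) as $i => $chunk) {
            $drives[$i % 2] .= $chunk;  // even stripes -> drive 0, odd -> drive 1
        }
        return $drives;
    }

    list($drive0, $drive1) = stripe("ABCDEFGHIJKL", 2);
    echo "Drive 0: $drive0\n";  // ABEFIJ
    echo "Drive 1: $drive1\n";  // CDGHKL
    // Lose either drive and neither half can rebuild the original data.
    ?>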

Two Intel main boards. They're not server main boards but don't forget I was trying my utmost to keep costs down here. The boards have onboard sound [which was disabled, so it matters not] but no onboard video. We went this way because apparently onboard video noticeably creates extra overhead and drains resources.
Two floppy drives. Just cheapies. Didn't really matter what they were because beyond setting the servers up there would be little opportunity for them to ever be used again.
Four Direct PC 512MB memory modules. Two of these per machine, giving 1GB of memory to each server. Sweeeeet.
Two el cheapo PCI graphics cards. We had to use PCI due to space limitations inside the 2U cases - AGP cards wouldn't fit. Kind of stupid that the only cards we could find were 32MB, and even those were a nightmare to find. Seems no one keeps PCI video cards in stock anymore.
LG CD-ROM drive. We ended up just buying one of these and swapping it in and out of the servers as needed whilst we set up the operating systems. Ended up deciding not to ship it with either of them, both to ensure the cases had optimum airflow and because it would never be used again considering my lack of physical access to the machines.
PCI riser cards. Used to reposition the other cards - network, RAID controller and video - horizontally so that they would all fit inside the 2U cases.
Two Intel network cards. Last thing we wanted was a NIC that would crap itself, so there was no choice other than Intel 10/100 cards.
Server rails. Used to mount the server cases to the racks when they arrive at the facility in the US.

 
