With so much going on I've become delinquent (as usual) in updating you all, but hopefully I'm at the turning point on that front. The Pentium D 805, Socket-AM2 and the MacBook Pro were all very interesting, but right now the main thing I'm working on is performance under Bethesda's hottest title: Oblivion.
Oblivion is turning out to be the most stressful game I've run on the latest hardware, and its performance stats make that very clear. Even with a pair of Radeon X1900 XTs you still can't run at 1600 x 1200 with everything turned up and get smooth frame rates everywhere. Although the majority of my testing thus far has been with GPUs, later this week I will start to look at the impact of the platform as a whole on Oblivion, eventually leading into CPU performance in the title.
The game itself is quite possibly the best I've played in several months, at the very least; that's a pretty big compliment coming from someone who historically isn't the biggest RPG fan, but Oblivion is accessible enough to make it fun for just about anyone.
My Oblivion testing had me dealing with basically every GPU released over the past two to three years, which in turn left me with a handful of gripes that I don't think I've properly voiced here - most of them involving CrossFire/SLI.
Just so you all know what happens behind the scenes: whenever we write a story even remotely hinting at the idea of running SLI on a non-NVIDIA platform, we usually get several angry emails from NVIDIA. And while I would never recommend purchasing a non-NVIDIA motherboard with hopes of running SLI on it, NVIDIA's reaction does highlight a much larger problem. NVIDIA, and to a lesser extent ATI, are far too focused on delaying the transition to a truly seamless multi-GPU environment in order to try to lock customers into purchasing one chipset or another.
While NVIDIA's policies don't really hurt it in the mainstream, at the very high end they are nothing but annoying. Although ATI's CrossFire Xpress 3200 (RD580) chipset is finally competitive with NVIDIA's SLI offerings, the latter is simply found on more motherboards, ones that have been around for much longer. So I can see why a lot of users would still prefer to go the safe route with an NVIDIA SLI platform instead of the ATI chipset. But if you're looking for the absolute highest performance in Oblivion, you'll want a Radeon X1900 XT(X), and if you can afford it and want even better performance you'll want two. Unfortunately that means you've got to change your purchase to either NVIDIA GPUs or an ATI motherboard, which may not be what you originally wanted to do.
It hurts ATI as well; take the example above. ASUS' RD580 board has gotten some great reviews and is a very solid motherboard, making it very good competition for ASUS' own SLI x16 board. But if you want to use NVIDIA GPUs with it in SLI (or with hopes of someday upgrading to SLI), you're out of luck. Once again your GPU choice ties you to a particular chipset choice.
Although a lot of validation goes into testing and certifying SLI/CrossFire platforms, there's no harm (to the end user) in at least offering "at your own risk" support and seeing what sort of response there is. While publicly the problem is always framed as one of validation and guaranteeing an excellent user experience, the real goal is to force exclusivity within a computer.
With a tremendous installed base of SLI platforms, NVIDIA is far less likely to just wake up one day and offer support for ATI and Intel chipsets (unless one of them ponies up and pays a lot of money), so instead I turn to ATI. ATI has already enabled support for CrossFire on Intel 975X platforms, and if it wants to gain further acceptance of its multi-GPU solution it should do the same on NVIDIA SLI platforms. While that could potentially hurt its chipset sales, it also has the potential to increase GPU sales if ATI can be the only company to offer a multi-GPU solution that works on any chipset.
Ideally it would also push NVIDIA to do the same, hopefully bringing an earlier end to what is a truly silly situation. By not offering universal multi-GPU solutions that work on any platform equipped with the right number of PCIe slots, ATI and NVIDIA are not working in the best interests of their customers and are instead rather publicly operating in the best interests of their own pockets. Although it's unfortunately rare for the customer to come first, the current multi-GPU platform situation makes that more pronounced than usual.
ATI has already taken the first step by offering support for Intel 975X platforms, unfortunately at a time when Intel's platforms aren't very popular among gamers. While it's a nice (and perhaps calculated) gesture, I want more. The question is, will ATI go the rest of the way?
I'm headed off to the airport now; tomorrow Vinney and I have two meetings that should hopefully be the last two things we'll need to sign off on before the house can be finished up. We're finalizing our hardwood floor stain color, and I'm doing a final walkthrough with the structured wiring guy to make sure that the excessive amounts of CAT5e are where they should be. With those two things done, we hope to close on the house during the second week of May. The trip down to NC will be a short one because of work; I should be back on Wednesday with Oblivion as my top priority. If there's anything in particular you'd like to see, let me know and as always I'll do my best to include it.
Once again, I'm sorry for letting you all fall behind on what's going on, but I hope to change that once this move back home finally happens. There are also a lot of changes in the works here at AT, which I will update you all on at another time. Until then, take care and have a great week :)
29 Comments
theteamaqua - Saturday, April 22, 2006
I have two GeForce 7800 GTX 256MB cards in SLI with a P4 660, and it runs great at 1280x1024 (my LCD's resolution) with everything turned up to max except Item and Actor fade. 4x AA can't be turned on with HDR, which is another issue, but it's slow enough in some areas that with 4x AA it's going to be even slower.

Evan12345 - Monday, April 24, 2006
I just hope that the benchmarks are not all on the same CPU, i.e. benching a 6200 on an FX-57. Does that make any logical sense? If one wanted to know how his/her GPU would handle Oblivion, then using a CPU like an FX-60 would skew the results, since people with 6600GTs do not have $1000 CPUs and 4GB of RAM. I think Anand should bench on high, mid and low range systems:

High range: the best available, 6800 and higher.
Mid range: AMD 3500+ (2.2GHz), 1GB RAM; 7600GT, 7600GS, X800GT, X800GTO, X800GTO2, 6600GT.
Low range: 3000+ or 3200+, 512MB RAM; 6600 and lower, plus X700 and lower.

He should also show the graphics settings that should be used to get a playable framerate (30-35 FPS).
Evan12345 - Monday, April 24, 2006
Damn, I didn't spell check. Sorry.

eman7613 - Friday, April 21, 2006
Neither ATI nor NVIDIA is going to make their chipsets compatible both ways. This is because of one simple thing: each one wants to make its chipset the most popular, because it's cheaper to produce the onboard chips! If it costs $8 to make the board with the chip, and you sell it to the retailer for $50, you make $42, a 4:25 cost-to-price ratio. If it costs $140 and it sells to the retailer for $330, you make $190, a 14:33 ratio. Which makes more money? The chips!
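To put the commenter's margin arithmetic in one place, here's a minimal sketch; the dollar figures are the commenter's own rough illustrations, not real manufacturing numbers:

```python
# Profit and cost-to-price ratio for the two illustrative products above.
def margin(cost, price):
    profit = price - cost
    return profit, cost / price  # absolute profit, and how much of the price is cost

for name, cost, price in [("chipset board", 8, 50), ("graphics card", 140, 330)]:
    profit, ratio = margin(cost, price)
    print(f"{name}: ${profit} profit per unit, cost is {ratio:.0%} of price")

# chipset board: $42 profit per unit, cost is 16% of price  (4:25)
# graphics card: $190 profit per unit, cost is 42% of price (14:33)
```

The absolute profit per graphics card is higher, but a far larger share of the chipset board's selling price is margin, which is the commenter's point about why each vendor wants its own chipset to win.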
judmarc - Thursday, April 20, 2006
1. If you were trying to decide between relatively darker and lighter shades for the floor stain, I hope you picked the lighter - it shows dusty footprints a lot less (as we learned after picking the slightly darker shade :( ).
2. Do *not* expect work/decisions about the house to slow down much, if at all, for a year or more after you move in - believe me, there is always something to improve or fix that somehow slipped through the cracks and needs to be taken care of, or (drum roll) landscaping. What *does* slow down, fortunately, is the cash outflow: from tens of thousands or thousands per week to hundreds, then to the occasional couple or few grand for the next big project.
customcoms - Wednesday, April 19, 2006
I think this whole chipset/graphics card fiasco is going to be cleared up by the motherboard manufacturers. One of two things will happen:

A) Driver and/or BIOS support for the opposite dual-card configuration will be implemented (like the ULi chipset hack).
B) The uber high end boards will have TWO chipsets on them: one for driving SLI (NVIDIA) and one for driving CrossFire (ATI); the board's BIOS will decide which chip to use with which cards (or, for that matter, there could be three BIOSes: one to boot and one for each chip).

At which point, NVIDIA and ATI will do one of two things:

A) Realize the futility and adopt a standard.
B) Continue allowing boards to be sold with method B above so both companies reap the profits... not good for the end user (EXPENSIVE).
imterry - Wednesday, April 19, 2006
Isn't the amount of legwork one has to do to buy/build a house preposterous? When my wife and I signed the paperwork for our 1st house back in '96, we were stunned at the pile of papers that we had to sign, initial, and date. What was really irritating was that within 6 months, our mortgage had been transferred to a new company and we had more paperwork to fill out for that company.

Good luck to you and your family as y'all get ready to move in! Keep your builder's info handy so that you can kick them around if/when defects start showing up. ;)
tynopik - Wednesday, April 19, 2006
It won't just "potentially hurt its chipset sales" - it will absolutely destroy them. If you had a choice between:
a) a chipset that only supported ATI
b) a chipset that supported BOTH ATI and NVIDIA
Which one are you going to choose?
The only way for this to be resolved is for them to come to some mutual agreement to support each other's boards.

As long as one holds out, the other one HAS TO hold out also.
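That holdout dynamic reads like a simple two-player coordination game. Here's a hedged sketch with invented payoff numbers, purely to illustrate the commenter's logic (none of these values come from either company):

```python
# Toy 2x2 payoff table: each vendor either keeps multi-GPU locked to its own
# chipsets or opens it to the rival's. Payoff numbers are invented for illustration.
payoffs = {
    # (ATI, NVIDIA) choices        -> (ATI payoff, NVIDIA payoff)
    ("locked", "locked"): (2, 2),  # status quo: both keep captive chipset sales
    ("open",   "locked"): (1, 3),  # unilateral opener's chipset sales collapse,
    ("locked", "open"):   (3, 1),  # since the rival's boards now support both vendors
    ("open",   "open"):   (4, 4),  # mutual agreement: best outcome for both
}

for (ati, nvidia), (pay_ati, pay_nv) in payoffs.items():
    print(f"ATI {ati:>6} / NVIDIA {nvidia:>6} -> ATI: {pay_ati}, NVIDIA: {pay_nv}")
```

With payoffs shaped like these, neither side benefits from opening unless the other does too, which is exactly why the commenter argues only a mutual agreement can resolve it.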
nlhowell - Wednesday, April 19, 2006
I'm a little disappointed, Anand; I woulda thought you a Cat6 kinda guy... gigabit routers are already on the market...

creathir - Wednesday, April 19, 2006
Cat5e is the EXACT same cable, it just has not been "rated" for Cat6. The other factor is that over the short distances inside a house (almost always < 150') it just does not become a factor. Cat6 cable is nice, and while it is good to future-proof yourself (any structured media guy will tell you that), sometimes it just does not make sense. Most structured media guys like to put FIBER to every drop. Anand may have Cat5e for the network, but maybe he put fiber in to every drop as well. It could be that Anand just does not feel Cat6 is really that important... honestly, how many home networks need more than a gigabit LAN? They do not generate that much traffic... even during a LAN party it would be almost impossible to max out the network. I suppose if you really wanted to you could... but really, what's the point?

- Creathir
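As a rough sanity check on that last claim, here's a minimal back-of-the-envelope sketch; the overhead and per-client traffic figures are assumptions for illustration, not measurements:

```python
# Back-of-the-envelope: how much of a gigabit LAN a small LAN party might use.
LINK_BPS = 10**9          # gigabit Ethernet line rate, bits per second
EFFICIENCY = 0.90         # assume ~10% lost to framing/protocol overhead

usable_mbytes = LINK_BPS * EFFICIENCY / 8 / 1e6   # megabytes per second
print(f"Usable throughput: ~{usable_mbytes:.0f} MB/s")

# Hypothetical load: 16 players each generating a generous 1 MB/s of game traffic.
clients, per_client = 16, 1.0
load = clients * per_client
print(f"LAN party load: {load:.0f} MB/s ({load / usable_mbytes:.0%} of the link)")
```

Under those assumptions, even a sixteen-player game uses well under a fifth of the link, which supports the commenter's point that typical home traffic won't saturate gigabit.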