7900 gt in sli, worthwhile upgrade???
#1
Scooby Regular
Thread Starter
iTrader: (1)
Join Date: Aug 2005
Location: Manchester ish
Posts: 18,547
Likes: 0 · Received 0 Likes on 0 Posts
Good morning
I am thinking about updating my graphics setup.
I currently run 2 x Gigabyte 6600GTs in SLI (the factory-overclocked ones) and, whilst most games run okay, if I turn on full anti-aliasing, filtering etc. I do get noticeable slowdown.
The max resolution I play games at is 1280 x 1024, so will I get a big difference in quality/framerate with the new setup at that resolution, or would we be talking about a few fps?
Cheers for any advice
#2
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
Well, I'm running 2 x 512MB 7800s in SLI, and run all games at 1600x1200 with full anti-aliasing, filtering, max draw distances etc. (basically all detail settings maxed out), and I've not had any slowdown yet.
I expect that the 7900 in the same SLI configuration will be even faster...
#3
Scooby Regular
Thread Starter
iTrader: (1)
Join Date: Aug 2005
Location: Manchester ish
Posts: 18,547
Sounds good to me then; will go ahead and do it. Can only play at a max of 1280 x 1024 though, as my monitor doesn't support higher than that.
Next upgrade I think will be an X2 processor; that should keep me fairly current for a year or 2.
#5
Scooby Regular
iTrader: (1)
Join Date: Dec 2002
Location: Surrey
Posts: 947
I used to have XFX 6800GTs in SLI, and now have an XFX 7900GT with a volt mod... The 7900GT is far better than the two 6800s.
I think at the resolution you're playing at, SLI is overkill (my opinion). I play Battlefield 2 at 1600 x 1200 with all the eye candy turned on... Alan
#6
Scooby Regular
Join Date: Feb 2002
Location: here
Posts: 10,641
Running SLI at 1280 x 1024 is pretty pointless IMHO. I doubt you need more than 256MB either, even with the settings maxed out. Spend the money on a single 7900GT or GTX instead.
#8
Scooby Regular
Thread Starter
iTrader: (1)
Join Date: Aug 2005
Location: Manchester ish
Posts: 18,547
Yeah, tried the Tom's Hardware site to check their interactive VGA charts; however, they haven't updated it yet to include the 7900GT.
What resolution do you think SLI becomes beneficial at, then? I can certainly tell the difference if I run one 6600GT or both in SLI, even with effects turned off (including some older games like Microsoft RalliSport Challenge). The other games I mainly play are NFSU 1 and 2 and GTA: San Andreas.
#9
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
With the 512MB 7800s, I can tell the difference at 1600x1200 when I turn SLI off in some games (B&W2, F.E.A.R., and GT Legends).
#10
Check out the ATI X1900XT and XTX; they're superb cards - good enough for me to defect to ATI after years of running nVidia cards. From the benchmarks and reviews I studied before buying one, the case for a 7900GT/GTX SLI solution was less than clear-cut, even before the price of a second card entered the equation. With a couple of titles it works well, but in others it made little difference, and in some cases it even slowed things down. A couple of months ago the X1900XTX was the best single-card solution on the market.
Gary.
#11
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
Interesting. Where did you read that, Gary?
I have to say, I haven't encountered a single game (or benchmark) where the SLI config hasn't sped things up considerably...
#12
Iain,
The benchmarks are in this Tom's Hardware article:
http://tomshardware.co.uk/2006/03/09...nch_mayhem_uk/
In the synthetic benchmarks there was a sizeable advantage to SLI (or ATI's CrossFire equivalent), but F.E.A.R. aside, in the games tests things are different.
The consensus of opinion from various BBSs and gaming mags seemed to point in favour of the ATI card over nVidia, but the whole ATI vs nVidia debate often resembles a holy war! The clincher for me was that the ATI card gave the best results for Oblivion, and the desire to run that game at the highest detail settings was what prompted the upgrade in the first place.
Gary.
#14
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
I may be missing something, but according to that review the SLI benchmarks were all coming out faster.
I know that, in the real world, the SLI configuration runs stuff much quicker than if I put it into single-card mode. All the games I'm running on the PC at the moment (B&W2, F.E.A.R., GT Legends, Dungeon Siege, The Movies, Battlefield 2 etc.) demonstrate this quite noticeably...
Whether it's worth the extra expense is another question altogether
#16
Scooby Regular
Thread Starter
iTrader: (1)
Join Date: Aug 2005
Location: Manchester ish
Posts: 18,547
Originally Posted by GCollier
Check out the ATI X1900XT and XTX, they're superb cards - good enough for me to defect to ATI after years of running nVidia cards.
Gary.
Also, a few years ago I had a 9800 Pro with an XT BIOS. I found the driver support, although vastly improved over what it had been, still very buggy with certain games, and I would find myself changing drivers to play an older game. With nVidia I have never experienced that (not to say it doesn't happen).
Originally Posted by MJW
You'd be better off buying 1 x 7900 and using the rest of the money towards a decent 1600x1200 screen.
#17
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
I'm using a Samsung SyncMaster 214T. It's a very nice piece of kit. There may be better ones out there now, as I've had mine a little while, but I'm extremely happy with it...
#18
Scooby Regular
Thread Starter
iTrader: (1)
Join Date: Aug 2005
Location: Manchester ish
Posts: 18,547
Originally Posted by Iain Young
I'm using a Samsung SyncMaster 214T. It's a very nice piece of kit. There may be better ones out there now, as I've had mine a little while, but I'm extremely happy with it...
#21
Originally Posted by Iain Young
I may be missing something, but according to that review, the sli benchmarks were all coming out faster.
Far Cry at 1024x768 with no AA or AF - a single 7900GT is marginally faster than SLI. Same with Doom 3 at those settings, and similar results for the 7900GTX in this Doom 3 benchmark.
Half-Life 2: Lost Coast at 1024x768 - a single 7900GT beats the SLI setup both with and without AA/AF. Same for the 7900GTX, which also beats the SLI setup at 1600x1200 both with and without AA/AF.
Of course there are other benchmarks where the SLI solution is well ahead, but not in all games, and generally only at huge resolutions.
Gary.
#22
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
I see what you mean (I was comparing it with the GT). I suspect the result is mainly down to the fact that Far Cry and Lost Coast are both pretty old games now and don't use any of the features that the SLI solution addresses (I think they both use DX8). I would expect any game written for DX9.0c, or the forthcoming DX10, to show significant improvements with the SLI config.
I find it a little strange that they are still using old software to benchmark these new cards...
#23
Scooby Regular
Join Date: Feb 2002
Location: here
Posts: 10,641
Originally Posted by Iain Young
I suspect the result is mainly down to the fact that Far Cry and Lost Coast are both pretty old games now and don't use any of the features that the SLI solution addresses (I think they both use DX8).
#24
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
Far Cry was released in 2004, long before DX9.0c (which supports the latest stuff). It's an old game.
Last edited by Iain Young; 04 June 2006 at 07:57 PM.
#25
Scooby Regular
Join Date: Feb 2002
Location: here
Posts: 10,641
Originally Posted by Iain Young
Far Cry was released in 2004, long before DX9.0c (which supports the latest stuff). It's an old game.
#26
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
Was it? Blimey.
My copy of Far Cry installs 9.0, though (I tried it last night). Is it just me, or has that game seriously dated (especially when you compare it to the current releases)?
My comment still stands, though. Why benchmark cards designed for DX10 etc. with software written 2-3 years ago? It doesn't make sense to me...
#27
Scooby Regular
Join Date: Feb 2002
Location: here
Posts: 10,641
Originally Posted by Iain Young
My comment still stands, though. Why benchmark cards designed for DX10 etc. with software written 2-3 years ago? It doesn't make sense to me...
#28
Scooby Regular
Join Date: Sep 1999
Location: Swindon, Wiltshire Xbox Gamertag: Gutgouger
Posts: 6,956
I read that the 7900 has been designed to support the new features going into DX10, and when it is finally released a quick driver update will be all that is needed. Same with the latest ATI card.
Even if it doesn't, it still seems weird to me that they are testing high-end cards like this using old software which doesn't particularly stretch their capabilities at all.