Offtopic — Any topics not related to the games we cover. Doesn't mean this is a Spam-fest. Profanity is allowed, enter at your own risk.
Administrator
Posts: 17,739
Join Date: Apr 2002
Location: Camp Crystal Lake

05-27-2004, 07:58 AM
Yeah, that's what I'm guessing. A vid card upgrade and some more RAM and I should be set.
Mind you, every goddamned penny these days is going towards flowers, candles, decorations, tuxedos, catering, etc etc etc for the wedding in October. I probably won't upgrade my rig 'til next Winter.
Owned.
General of the Army
Posts: 18,202
Join Date: Jan 2002
Location: Ireland
|

05-27-2004, 10:54 AM
256 MB cards are pretty high end these days, though 512 MB cards will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 GB of RAM on it to run at max detail, but that's a couple of years off, so 1 GB will probably be standard by then.
Quote:
Originally Posted by Nyck
But one of her fucking grandkids, pookie, rayray or lil-nub was probably slanging weed or rocks out of the house.
Senior Member
Posts: 5,158
Join Date: Jan 2002
Location: Gatineau, Qc, Canada

05-27-2004, 03:29 PM
Quote:
Originally Posted by Gerard
256 MB cards are pretty high end these days, though 512 MB cards will be available this year from both ATI and Nvidia. The Unreal 3.0 engine, according to Tim Sweeney, will require a card with 1 GB of RAM on it to run at max detail, but that's a couple of years off, so 1 GB will probably be standard by then.
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...
1st Lieutenant
Posts: 4,657
Join Date: Jan 2002
Location: California, USA

05-27-2004, 03:51 PM
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradable.
Colonel
Posts: 8,386
Join Date: Mar 2002
Location: wut

05-27-2004, 03:52 PM
Quote:
Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradable.
They tried that a long time ago. My old S3 video card had upgradable RAM.
General of the Army
Posts: 18,202
Join Date: Jan 2002
Location: Ireland

05-27-2004, 03:57 PM
Quote:
Originally Posted by intrestedviewer
Why does the vid card need its own memory? Why can't it use the computer's memory? oOo: Maybe they should make video card memory upgradable.
It can use the PC's memory, but since it's not attached to the card there's lag involved in the gfx card sending instructions to it; that, and most likely the game will be using a load of memory anyway. Would make for choppy gameplay.
Upgradable gfx cards would be a good idea these days. Don't really know how it could be done though, since the memory chips are soldered to the board. Then there's the question of mixing slower RAM timings on the board and shit. Be more hassle than it's worth, I'd say.
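The lag point above can be put in rough numbers. A quick back-of-the-envelope sketch in Python — the bandwidth and per-frame figures are illustrative assumptions for 2004-era hardware (AGP 8x bus to system RAM vs. on-card memory on an X800-class board), not measured specs:

```python
# Rough illustration of why a graphics card wants its own fast local memory.
# Both bandwidth figures below are ballpark assumptions, not measured specs.

AGP8X_BANDWIDTH_GBPS = 2.1    # assumed peak bus bandwidth to system RAM (AGP 8x)
LOCAL_VRAM_GBPS = 32.0        # assumed on-card memory bandwidth (X800-class)

def transfer_ms(megabytes: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move `megabytes` of data at the given bandwidth (GB/s)."""
    return megabytes / (bandwidth_gbps * 1024) * 1000

frame_data_mb = 64  # textures + buffers touched per frame (illustrative guess)
over_bus = transfer_ms(frame_data_mb, AGP8X_BANDWIDTH_GBPS)
local = transfer_ms(frame_data_mb, LOCAL_VRAM_GBPS)
print(f"over AGP bus: {over_bus:.1f} ms, local VRAM: {local:.1f} ms")
```

Even with these generous numbers, shuttling one frame's worth of data over the bus alone burns most of a 30 fps frame budget, while local VRAM does it in a couple of milliseconds — which is exactly the choppy gameplay described above.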
2nd Lieutenant
Posts: 3,811
Join Date: Apr 2002
Location: Redmond, Home of Microsoft

05-27-2004, 03:57 PM
Quote:
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...
Dual shotgun PCIX x800xt 512.
Senior Member
Posts: 8,792
Join Date: Apr 2002
Location: Hans-AlbinVonReitzenstein

05-27-2004, 03:59 PM
no
General of the Army
Posts: 18,202
Join Date: Jan 2002
Location: Ireland

05-27-2004, 04:00 PM
Quote:
Originally Posted by Miscguy
Quote:
Even THEY have trouble getting smooth framerates at present... so by the time the engine is utilized, we'll have those 512/Gig video cards. Unreal 3.0 looks insanely detailed... can't wait to see what they'll come up with...
Dual shotgun PCIX x800xt 512.
I really think that's more of a gimmick than anything else. From the article I read, it boosts performance by around 40% — if both cards are only doing half the work each, then why not at least around a 90% boost? Obviously the cards were not designed with a version of SLI in mind, so this Alienware thing seems to be more of a hack job that ends up losing performance.
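The 40%-vs-90% intuition lines up with an Amdahl's-law-style estimate. A small sketch (the 1.4x figure is just the ~40% boost quoted above; the model is a simplification that ignores driver and bus overhead):

```python
# Amdahl's-law sketch of why two cards don't automatically give ~2x.
# If only a fraction p of the per-frame work can be split across n GPUs,
# the overall speedup is 1 / ((1 - p) + p / n).

def speedup(p: float, n: int = 2) -> float:
    """Speedup with n GPUs when fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(observed_speedup: float, n: int = 2) -> float:
    """Invert the model: what fraction of work must scale to see this speedup?"""
    return (1.0 - 1.0 / observed_speedup) / (1.0 - 1.0 / n)

print(speedup(1.0))            # perfect split of all work: exactly 2.0x
print(parallel_fraction(1.4))  # ≈ 0.571 -> only ~57% of the work is scaling
```

Inverting the quoted ~40% boost suggests only a bit over half the per-frame work is actually being split across the two cards — consistent with a bolted-on rather than designed-in SLI scheme.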
2nd Lieutenant
Posts: 3,811
Join Date: Apr 2002
Location: Redmond, Home of Microsoft

05-27-2004, 04:10 PM
I wasn't even referring to the Alienware gimmick. I just think that, with what games can do these days, the old-school way of dual-shotgunning graphics cards should return, especially with the new PCIX interface. Of course, with the astronomical cost of a single card, running 2 would bankrupt some small countries, let alone even the hardest of the hardcore gamers.
Now, with Alienware's hack job there's a 40% or so increase in performance (I'll take your word for it, seeing as I haven't seen any stats), and I wonder about bottlenecks in other areas. That, and the fact that it's a hack job and not something implemented from the get-go, means you're naturally not going to see the increase in performance you would expect. Frankly, I think if major graphics developers were to look into this and implement their own solutions in next-generation lines, you would see the performance.
Brigadier General
Posts: 10,721
Join Date: Apr 2003
Location: C-eH-N-eH-D-eH eH?

05-27-2004, 04:12 PM
I vote to bring back multiple VPUs on a video card.
1st Lieutenant
Posts: 4,657
Join Date: Jan 2002
Location: California, USA

05-27-2004, 04:18 PM
VPU's
General of the Army
Posts: 18,202
Join Date: Jan 2002
Location: Ireland

05-27-2004, 04:26 PM
Well, the last real multi-VPU cards were the Voodoo 5 5500 and the 6000. Personally I'd hate to see Nvidia attempt a multi-VPU board; their boards are stupidly large as they are now, almost on par with the 6k, and they only have the one VPU on them. oOo:
I suppose it's only a matter of time before this happens again though, or at least before a company realises SLI is still very useful for people wanting more performance... and the amount of 3DMark freaks who live to benchmark fit into that category.
Colonel
Posts: 8,386
Join Date: Mar 2002
Location: wut

05-27-2004, 05:09 PM
The XGI Volari Duo V8 has 2 VPUs and it sucks.
Major General
Posts: 13,482
Join Date: Jun 2002
Location: University Park, PA

05-27-2004, 05:28 PM
fire2: