It probably only runs Crysis now, but back in the day it likely wouldn't have, due to crappy drivers.
We spent a long time messing with drivers to get proper SLI working back then. (We still kinda do, but it's much easier these days.)
We need a modern SLI vs. CrossFire build and comparison.
Linus, I love your videos, but please stop pressing whatever you are pressing in your pocket. It's so distracting and breaks the focus on you. Maybe it's just me, but I have to say it.
Voodoo2s... man, we're getting old.
Great video! I never had dual Voodoo 2s in SLI, but I did have a Voodoo Banshee and a Voodoo 3 (still in my retro gaming PC). I had three Nvidia SLI systems: dual 6600 GTs, 7900 GTs, and 8800 GTs.
Please do more of these history-focused videos.
I've still got an SLI system from 2009, dual 6600 GTs. It had a little card you turned around for single or dual mode and whatnot. It worked alright; three years after I bought it I actually put a second card in. Supreme Commander was actually playable, can confirm.
...the 9600 I bought another two years later for 60 bucks destroyed that setup, however... and DDR1 RAM is actually really expensive...
Basically, if you want to stay ahead, then WELCOME to Whose PC Is It Anyway, where your tech is always outdated and the price doesn't matter anyway!
Damn, I'm old.
I remember when Intel was launching the Q6600 CPU and people were laughing at us for wasting money on a quad-core CPU... Seems like a lifetime ago.
I have two XFX GeForce 7900 GTX 512 MB cards which I may be willing to sell to any history fans out there! Played lots of Battlefield 2 on them, eventually in a well-ventilated system, an Antec 1200! :) Contact me if interested.
Hey man, why do you keep reaching under the table like that? What exactly is going on down there that you can't keep your hand off it?
It can run Crysis.
Take my money and my GTX 1060 rig and give me this badass.
OMG, I had 6800 Ultras in SLI. Guess I was baller back then, lol.
I like your videos, man. You're so goddamn good at what you do.
Taiwanese and Chinese? Can those two be listed side by side now?
I thought he said "sponsored by Zantac"... lol.
A moment of silence for those who don't get the Crysis joke.
CF!
...speaking of not around anymore: TunnelBear! :P
Wait... VRAM is cloned? Why?
It seems like it would be trivial to have each card render its own frame using its own VRAM and then have the master card interleave them. That process in and of itself might take a little bit of VRAM on the master side, and it certainly wouldn't make the VRAM stack, but it would surely give you more benefit than just straight-up mirroring. That just sounds dumb.
But there must be some reason. Maybe so you can share data between cards without giving up one of your two memory access channels (that is what DDR means, right?) to let one card access the other card's VRAM (which would actually make the VRAM stack, at the expense of losing DDR), or without taking the performance hit of letting it explicitly request data? But... why would you need to do that?
Oh wait, I'm stupid: data stays in VRAM between frames, so it's helpful to know what the other card is doing. But if you're splitting the frame at some arbitrary scanline, you don't know or care what the other card is doing, at least not enough that you couldn't just let one card request data from the other one instead of mirroring it all the time.
Can someone who knows better than I do please enlighten me? Tell me what I've forgotten, or how much of what I just said is wrong (probably almost all of it)?
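(Replying to the question above: the usual answer is that textures, geometry, and render state have to be resident in each card's local VRAM, because any pixel in either card's half of the frame can sample any texel, and fetching that data over the SLI bridge or PCIe mid-frame would be far too slow; mirroring trades capacity for latency. Also, DDR just means double data rate, i.e. transfers on both edges of the clock, not two independent access channels. Here's a minimal toy sketch of the sampling problem in Python; all the names and the sample math are made up for illustration, and this is nothing like real driver code:)

```python
# Toy sketch of why split-frame rendering (SFR) still mirrors VRAM:
# each "GPU" rasterizes only its own band of scanlines, but any pixel
# may sample ANY texel, so each card keeps a full local copy of the
# texture data instead of fetching from the other card mid-frame.

WIDTH, HEIGHT = 8, 8

# Pretend texture atlas. In real SLI this is resident in BOTH cards'
# VRAM (the "mirroring"), which is why memory doesn't stack.
texture = [[(x * 31 + y * 17) % 256 for x in range(WIDTH)]
           for y in range(HEIGHT)]

def render_band(y_start, y_end, local_vram):
    """One 'GPU' shades its band of scanlines, sampling arbitrary texels."""
    band = []
    for y in range(y_start, y_end):
        row = []
        for x in range(WIDTH):
            # Dependent texture read: the sample location is computed
            # per pixel, so we can't predict which texels a band will
            # touch -- hence every card needs the whole texture locally.
            tx, ty = (x * 3) % WIDTH, (y * 5) % HEIGHT
            row.append(local_vram[ty][tx])
        band.append(row)
    return band

# Each GPU works from its own identical copy of the texture data.
top = render_band(0, HEIGHT // 2, texture)
bottom = render_band(HEIGHT // 2, HEIGHT, texture)
frame = top + bottom  # the master card just stitches the bands together
for row in frame:
    print(row)
```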
When those cards were new... I was running whatever onboard ATI graphics chip was in this Compaq computer I got, with a single-core AMD Sempron CPU... 1.8 GHz or 2 GHz... and 512 MB of RAM! But hey, the thing could run Half-Life 2! Pretty damn good for a computer I picked up in 2005! (Maybe '04.)
Before that I was running an AMD Duron... I replaced its 8 MB ATI Rage graphics card with an ATI... forgot what generation and model... but a low-end 64 MB card... back when 128 MB cards were high-end, and mine was a low-end card... (Miles above the ATI Rage, though.)
I didn't get more serious and build my first gaming PC until 2007: an AMD X2 4200+ (I think?) at 2.4 GHz, 4 GB of RAM, and an ATI Radeon X1650 Pro with 512 MB of VRAM (later upgraded to an X2 6000+ at 3 GHz). And a mid-level ATI 5000-series card.
Talking about old hardware just makes me nostalgic!
We need to go back to the time when more than two companies made major graphics cards. I know in the mobile sector there are more, and probably some Asia-only brands... We need to not let miners control the card market... We live in sad times...
https://youtu.be/qJtRA31AQAU?t=464 . Technically speaking, even an MX 440 can "run" Crysis with some config file changes. But to actually RUN Crysis in DX10 at reasonable settings, you need at least an 8800; the 6800 doesn't even support DX10.
30 fps is barely playable? Lol, someone make this guy use a normal person's computer for a month.
Oooh, this brings back memories.
I know this channel is kinda Intel- and Nvidia-sponsored, but I would love the same video about CrossFire; it's interesting.
Linus, why do you talk in such a patronising way now? You used to be so down to earth...
It says the 21st of March, but I'm watching this on the 20th, lol.
That case is horrendous. I love it.
I like these videos. It's cool to see how hardware was back then.
Aawww, the memories, hehe.
Nice case XD
Why not mention the SLI key on the motherboard for enabling SLI? That DFI board had jumpers instead of the little selector card, but you still had to swap stuff to get it working.