Best Graphics Card for TS7.1

About Truespace Archives

These pages are a copy of the official truespace forums prior to their removal somewhere around 2011.

They are retained here for archive purposes only.

Best Graphics Card for TS7.1 // Archive: Tech Forum


Post by xmanflash // May 4, 2006, 3:37pm

xmanflash
Total Posts: 335
Hi Folks, sorry - I searched and couldn't find much..


I need to buy a budget DirectX-compliant card.. am thinking of an Nvidia 6600GT - will this be fast enough for TS7.1 or should I look for something beefier to be able to use it properly?

Post by TomG // May 4, 2006, 4:15pm

TomG
Total Posts: 3397
Plenty fine! I have a regular 6600, no GT, and it works fine for me :)


HTH!

Tom

Post by scapino // May 5, 2006, 5:42am

scapino
Total Posts: 101
pic
I have the 6600GT as well, and it works SWELL. :)


Kurt

Post by splinters // May 5, 2006, 11:33am

splinters
Total Posts: 4148
pic
I have a Radeon 9800 Pro 256MB. I realise this is not what you are looking to get but bear in mind that with this I get silky smooth display but no shadow filtering and only 3xAA...:)

Post by stoker // May 5, 2006, 11:53am

stoker
Total Posts: 506
From my experience I am never going to buy an NVIDIA card again :mad::mad:(bloody things.............):(:) I think ATI is by far the best and I am currently searching for one :confused:

Post by xmanflash // May 5, 2006, 5:02pm

xmanflash
Total Posts: 335
From my experience I am never going to buy an NVIDIA card again :mad::mad:(bloody things.............):(:) I think ATI is by far the best and I am currently searching for one :confused:

I have to agree - I love ATI.. HOWEVER..

I found a post by a very clever bloke called Jan on the Matrox forums who has a machine similar to mine, with 5 SCREENS running on it!

Check this out! - http://home.wxs.nl/~jarkest/fs/4thsetup.html 4 PCs and 10 monitors... for a flight-sim setup!

He has a Matrox APVe triple head running in 1 SLI slot and a GeForce 6800GT in the second, obviously not in SLI mode!

So I took my machine to a mate's last night, who happened to have an SLI version of my motherboard (Abit KN8-SLI) and a spare GeForce 6800GS video card (the cheaper and almost-as-good version of the older 6800GT), so we swapped the board, set it all up and tried it out.

It works great! I have a problem switching between the different outputs, so am using 3 LCD monitors at the moment.. but I basically have two hardware profiles set at boot that I select, one with only the 6800 switched on, which then gives me great 3D performance in Truespace - although I am yet to become familiar with the Player interface. I can also play games again on my newest rig!! yay! :banana: :banana:

The 6800GS got a 5250 score in 3DMark05, which is good, but there are a couple of anomalies..

The GeForce is running in PCIe 8x mode (it should be 16x). I think it's because the Matrox card is not a real PCIe card - it has a PCIe bridge to what is basically an AGP card!!!, so I suspect the Matrox is holding the GeForce back.. - still - I can now video edit in HDV realtime, and with a reboot, do 3D with Truespace and Hexagon (which now works).

Another weird thing, but it may be helpful.. Modelling in TS7.1 on the Matrox APVe was super slow.. I don't know why, but everything took ages and caused some of my earlier vents on the forums. However, I upgraded the card's BIOS to version 17 and also the latest drivers, and now Truespace Modeller mode actually flies along - the proper textures show and everything! - so that's also a bonus.. Player mode still does not work as the card has no DX9 support of any kind.

Hope that all helps..

PM me if you want more details...

Post by Alien // May 5, 2006, 6:25pm

Alien
Total Posts: 1231
pic
From my experience I am never going to buy an NVIDIA card again :mad::mad:(bloody things.............):(:) I think ATI is by far the best and I am currently searching for one :confused:

If you don't care about Shader Model 3.0, & could live with having just SM2.0b, then I'd say an X800 GTO would be your best bet.


It works great! I have a problem switching between the different outputs, so am using 3 LCD monitors at the moment.. but I basically have two hardware profiles set at boot that I select, one with only the 6800 switched on, which then gives me great 3D performance in Truespace

What's this problem with switching between outputs? Please describe what you are trying to do & what happens [or doesn't].


The GeForce is running in PCIe 8x mode (it should be 16x). I think it's because the Matrox card is not a real PCIe card - it has a PCIe bridge to what is basically an AGP card!!!, so I suspect the Matrox is holding the GeForce back.. - still - I can now video edit in HDV realtime, and with a reboot, do 3D with Truespace and Hexagon (which now works).

Nope, it's the motherboard [I see someone didn't read my little lesson on PCI-E in that other thread the other day :p]. Basically, the short version is that most SLI boards have a max of 20 PCI-E lanes total provided by their chipset, but the graphics slots don't get the extra 4, just the 16. Put 1 x16 card in & it'll run in x16 mode no problem. Put a second card in & the motherboard splits the number of PCI-E lanes equally between the 2 slots, hence x8 mode.
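[Archive note: the lane-splitting arithmetic above can be sketched as a quick calculation. This is a minimal illustration, assuming 16 shareable lanes between the two graphics slots as the post describes; the helper function name is made up for the example.]

```python
# Sketch: how a typical 2006-era SLI chipset shares its PCI-E lanes
# between two physical x16 graphics slots. Assumption (from the post
# above): 16 lanes are shareable between the two slots; the remaining
# 4 of the chipset's 20 serve other devices.

def lanes_per_slot(cards_installed: int, shareable_lanes: int = 16) -> int:
    """Electrical lane width each graphics card gets."""
    if cards_installed < 1:
        raise ValueError("need at least one card")
    return shareable_lanes // cards_installed

print(lanes_per_slot(1))  # one card  -> 16 (full x16)
print(lanes_per_slot(2))  # two cards -> 8  (x8 each, as seen here)
```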

Post by xmanflash // May 5, 2006, 9:57pm

xmanflash
Total Posts: 335
What's this problem with switching between outputs? Please describe what you are trying to do & what happens [or doesn't].

I have 2 DELL LCD monitors, both with dual analog and DVI inputs, so in setting up the cards to work, one in my C: partition and the other in the D: XP partition, I have the Parhelia (PCIe 1) wired into analog 1 and DVI 2, and the GeForce into DVI 1 and analog 2 (the opposite).

The only problem is, the boot menu only shows on the primary PCIe card (Parhelia) and on monitor 1, so when I boot with the screens switched to the GeForce setup, I may miss the boot menu, which is a pain..

Ideally I need a dual (DVI+RGB) hardware switcher device to plug both cards into, and have one set of signals come out...

Still - it's a minor quibble.


Nope, it's the motherboard [I see someone didn't read my little lesson on PCI-E in that other thread the other day :p]. Basically, the short version is that most SLI boards have a max of 20 PCI-E lanes total provided by their chipset, but the graphics slots don't get the extra 4, just the 16. Put 1 x16 card in & it'll run in x16 mode no problem. Put a second card in & the motherboard splits the number of PCI-E lanes equally between the 2 slots, hence x8 mode.

Oops - I have to stop speed reading, sorry!.. OK.. so the GeForce is running at 8x and that is correct. Now why would my scores in 3DMark06 be coming in at half the speed of all equivalent machines? I had assumed it was maybe related to the 16x vs 8x thing, but if not then I have another issue to try to track down.. for an extra $400 Aussie dollars I have a great 3D rig as well as a great HDV editing rig.. no complaints here!

Post by Alien // May 6, 2006, 10:52am

Alien
Total Posts: 1231
pic
I have 2 DELL LCD monitors, both with dual analog and DVI inputs, so in setting up the cards to work, one in my C: partition and the other in the D: XP partition, I have the Parhelia (PCIe 1) wired into analog 1 and DVI 2, and the GeForce into DVI 1 and analog 2 (the opposite).


The only problem is, the boot menu only shows on the primary PCIe card (Parhelia) and on monitor 1, so when I boot with the screens switched to the GeForce setup, I may miss the boot menu, which is a pain..


Ideally I need a dual (DVI+RGB) hardware switcher device to plug both cards into, and have one set of signals come out...

Why don't you just have 1 monitor per card? :confused: You can still use them both at the same time, drag stuff from 1 to the other, & you don't need to fart around with hardware profiles [unless you like making things more complicated than they need to be :p].


Is what you've done what you thought I meant when I said I used that old Radeon & my old VD3 together? I've never used the Hardware Profiles feature myself, not even once - never found a need to. I'm sure it has its uses.... I just haven't come across 1 yet. ;)


Oops - I have to stop speed reading, sorry!.. OK.. so the GeForce is running at 8x and that is correct. Now why would my scores in 3DMark06 be coming in at half the speed of all equivalent machines? I had assumed it was maybe related to the 16x vs 8x thing, but if not then I have another issue to try to track down.. for an extra $400 Aussie dollars I have a great 3D rig as well as a great HDV editing rig.. no complaints here!

You've already answered the question for yourself, it's running with half the bandwidth [8 lanes instead of 16].
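[Archive note: as a rough illustration of what half the lanes costs, here is a sketch assuming first-generation PCI-E at its nominal 250 MB/s per lane per direction; real-world throughput is lower due to protocol overhead.]

```python
# Nominal PCI-E 1.x bandwidth per direction: 250 MB/s per lane.
PCIE1_MB_PER_LANE = 250

def link_bandwidth_mb(lanes: int) -> int:
    """Nominal one-direction bandwidth of a PCI-E 1.x link, in MB/s."""
    return lanes * PCIE1_MB_PER_LANE

print(link_bandwidth_mb(16))  # x16 -> 4000 MB/s
print(link_bandwidth_mb(8))   # x8  -> 2000 MB/s (half the bandwidth)
```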

Post by xmanflash // May 6, 2006, 8:30pm

xmanflash
Total Posts: 335
Why don't you just have 1 monitor per card? :confused: You can still use them both at the same time, drag stuff from 1 to the other, & you don't need to fart around with hardware profiles [unless you like making things more complicated than they need to be :p].


Is what you've done what you thought I meant when I said I used that old Radeon & my old VD3 together? I've never used the Hardware Profiles feature myself, not even once - never found a need to. I'm sure it has its uses.... I just haven't come across 1 yet. ;)




No - that won't work (well, theoretically), I need the Matrox card to be the primary card to get the HDV video overlay working properly, and the 6800GS video card to be the primary card to get Direct3D/OpenGL working - the system uses whatever is in the primary slot/primary window (1) to determine what it can process behind the scenes (i.e. overlay and 3D), otherwise that would be a great solution!



You've already answered the question for yourself, it's running with half the bandwidth [8 lanes instead of 16].


It appears I was wrong.. 3DMark06 was benching against 3DMark05 scores when I checked it on the online score system, so now having aligned the versions properly, I am running at the same speed as others with systems like mine, so it's all good!


Thanks for your invaluable help - I'll get down to some modelling - I have a manual arriving via international mail soon!

Post by Alien // May 7, 2006, 1:39am

Alien
Total Posts: 1231
pic
No - that won't work (well, theoretically), I need the Matrox card to be the primary card to get the HDV video overlay working properly, and the 6800GS video card to be the primary card to get Direct3D/OpenGL working - the system uses whatever is in the primary slot/primary window (1) to determine what it can process behind the scenes (i.e. overlay and 3D), otherwise that would be a great solution!
If it's just a question of it knowing which is primary, you can switch that in display properties [right click desktop, properties, Settings tab]. Click on the secondary display [if you're not sure which is which ATM, click the identify button], turn on the checkbox for "Use this device as the primary monitor", click apply, & it's changed. Having said that, it might switch the order [right to left instead of left to right] of where the cursor moves from 1 to the other, but you can fix this by dragging those numbered rectangles around into the right position, then click Apply.

I'm not familiar with HDV overlay, but TBH it sounds like it might be simpler to sell both Matrox & 6800 & get something like an X1800 [you might even find 1 with dual DVI]. I know the X1xxx range have a bunch of advancements over earlier ATI cards, some to do with video, so maybe it'd do what you want. Have a read of this article (http://enthusiast.hardocp.com/article.html?art=ODIyLDEsLGhuZXdz), especially this page (http://enthusiast.hardocp.com/article.html?art=ODIyLDcsLGhuZXdz) [I think the relevant part for you would be the Avivo section about 2/3rds of the way down the page]. If it sounds like an X1xxx series card would do what you want [in terms of replacing whatever the Matrox does], then that might be something worth thinking about.
<edit>
You could even email ATI, just to make sure it'll do what you want - or ask on the rage3d.com forums, as aside from a number of people who know more about ATI than I do there's even a few actual ATI employees that post there from time to time.
</edit>

It appears I was wrong.. 3DMark06 was benching against 3DMark05 scores when I checked it on the online score system, so now having aligned the versions properly, I am running at the same speed as others with systems like mine, so it's all good!
Ah, right - yeah, scores do tend to drop dramatically from 1 version to the next - I haven't even bothered with the '06 version.

Thanks for your invaluable help - I'll get down to some modelling - I have a manual arriving via international mail soon!
Glad I could do my little bit. :)

Post by parva // May 7, 2006, 2:12am

parva
Total Posts: 822
pic
Hi Folks, sorry - I searched and couldn't find much..


I need to buy a budget DirectX-compliant card.. am thinking of an Nvidia 6600GT - will this be fast enough for TS7.1 or should I look for something beefier to be able to use it properly?


I bought a Leadtek 6600GT some time ago when my MSI 6800 burned out.

Runs fine and without problems; the fan is also very quiet.


For the time in between I borrowed an ATI 9800 card from my friend, and even with this card trueSpace works well, but I couldn't use smooth shadows, antialiasing and supersampling in Player.


I'm not a fanboy, but in the past years I have had no problems with Nvidia cards, and the 6800 was very good. What I think all cards in this range and above suffer from, though, is extremely high temperatures and poor energy efficiency.

I have read that current GPUs now use even more transistors than CPUs do. I don't know, but I hope they become more efficient and use some intelligent tech like AMD's CPU speed-stepping or the like.

Post by Alien // May 7, 2006, 2:27am

Alien
Total Posts: 1231
pic
I'm not a fanboy, but in the past years I have had no problems with Nvidia cards, and the 6800 was very good. What I think all cards in this range and above suffer from, though, is extremely high temperatures and poor energy efficiency.

I have read that current GPUs now use even more transistors than CPUs do. I don't know, but I hope they become more efficient and use some intelligent tech like AMD's CPU speed-stepping or the like.

I don't know about CPU transistor counts, but as far as GPUs & temps go...

from that article on the HardOCP site:

NVIDIA boasted heavily that their G70 GPU contained 302 million transistors. Well, ATI one-upped them on this front as the R520 (Radeon X1800 XT and XL) boasts a total of 321 million transistors. The Radeon X1600 has 157 million transistors and the X1300 breaks the 100 million transistor mark. Interestingly, the X1000 series also use something called Dynamic Voltage Control technology that allows the voltages to be scaled up or down automatically as dictated by the load put on the GPU, which can reduce heat and power consumption when the card is not under heavy load.

Post by parva // May 7, 2006, 2:30am

parva
Total Posts: 822
pic
ah thanks Alien. That sounds interesting. Have you got the link to the full article?

btw. here's a picture of the destroyed GeForce 6800 *ouch*
http://www.parva-project.de/stuff/gpu1.jpg

Post by Alien // May 7, 2006, 2:42am

Alien
Total Posts: 1231
pic
ah thanks Alien. That sounds interesting. Have you got the link to the full article?

Yup. Scroll up to post #11 for a link to the version divided into pages, or use this link (http://enthusiast.hardocp.com/articleprint.html?art=ODIy) for the all-at-once printable version. :D


btw. here's a picture of the destroyed GeForce 6800 *ouch*

How'd you manage that? Did you overclock it, or was it just faulty?

Post by parva // May 7, 2006, 2:56am

parva
Total Posts: 822
pic
Yup. Scroll up to post #11 for a link to the version divided into pages, or use this link (http://enthusiast.hardocp.com/articleprint.html?art=ODIy) for the all-at-once printable version. :D


oops :D many thanks.


How'd you manage that? Did you overclock it, or was it just faulty?


No, I never overclocked it. I sent it back to MSI. Faulty, maybe, but it ran for more than a year without problems...

Post by Délé // May 7, 2006, 12:47pm

Délé
Total Posts: 1374
pic
I've got really mixed feelings with Nvidia right now myself. I have a 6800 ultra and it does work really well with TS7, but for me only the 78.01 drivers work properly. I tried updating to 84.21 but noticed that when I turned on the bloom and glow I got dark lines shooting off of the models (image below). I don't think that most people had that problem but I'm sticking with 78.01 myself. Apparently they can make good cards, but they forgot how to write drivers. There's no way to contact them either as their contact area of the website has been "under construction" since I've had the card. Grrr.


Now, I just got Hexagon 2 last week. I noticed that I can't model with that program because when I try to move or rotate edges or points, the program freezes for 15 or 20 seconds, blacks out, and then comes back. It's nearly impossible to model. I contacted tech support and what did they say? (big surprise)... Nvidia driver problems, 'doh. I tried updating to the 84.21 drivers again just to see if they would work better with Hexagon but the problem persists.


So I'm back to 78.01 (again) and I have one unusable program on my machine due to Nvidia drivers. Hopefully no other programs will require me to update to the newer drivers until they can write a stable one.


Please forgive my venting. It's been building for a while. :)

Post by xmanflash // May 7, 2006, 2:14pm

xmanflash
Total Posts: 335
I've got really mixed feelings with Nvidia right now myself. I have a 6800 ultra and it does work really well with TS7, but for me only the 78.01 drivers work properly. I tried updating to 84.21 but noticed that when I turned on the bloom and glow I got dark lines shooting off of the models (image below). I don't think that most people had that problem but I'm sticking with 78.01 myself. Apparently they can make good cards, but they forgot how to write drivers. There's no way to contact them either as their contact area of the website has been "under construction" since I've had the card. Grrr.

Now, I just got Hexagon 2 last week. I noticed that I can't model with that program because when I try to move or rotate edges or points, the program freezes for 15 or 20 seconds, blacks out, and then comes back. It's nearly impossible to model. I contacted tech support and what did they say? (big surprise)... Nvidia driver problems, 'doh. I tried updating to the 84.21 drivers again just to see if they would work better with Hexagon but the problem persists.

So I'm back to 78.01 (again) and I have one unusable program on my machine due to Nvidia drivers. Hopefully no other programs will require me to update to the newer drivers until they can write a stable one.

Please forgive my venting. It's been building for a while. :)

I'm using the official 84.43 drivers with my 6800GS and using Hexagon 2 so far with no problems - haven't done much with it though..

When I complained on the DAZ bugs list about Hexagon 2 not working on my Matrox APVe card, they wrote back that they only recommend ATI or Nvidia cards! (I think Matrox worked out that Hexagon 2 probably uses OpenGL 1.5, and Matrox only supports up to 1.2.. - I have to give it to Matrox tech support, they are excellent on the forums..)

Post by prodigy // Apr 4, 2007, 12:07pm

prodigy
Total Posts: 3029
pic
I recommend using a PCI Trident with 2 MB. 120 frames per second on the Player side with shadows enabled, bloom, 1280x960 pixels at 85Hz ....


Amazing vga..


Or you can buy a 6600 like most of us.. ;)

Post by Alien // Apr 4, 2007, 1:10pm

Alien
Total Posts: 1231
pic
I recommend using a PCI Trident with 2 MB. 120 frames per second on the Player side with shadows enabled, bloom, 1280x960 pixels at 85Hz ....


Amazing vga..

Dude, what have you been smoking?! http://homepage.ntlworld.com/alien42/smilies/rasta.gif I doubt you'd even be able to install XP with a 2MB Trident, nevermind trueSpace!


Or you can buy a 6600 like most of us.. ;)

Ha! I'd much rather have an ATI. :D

Post by prodigy // Apr 4, 2007, 3:49pm

prodigy
Total Posts: 3029
pic
I don't know what you are talking about.. I ran that Trident on Vista without problems....


Sure, when I wake up my wife kicks me out of my bed! hehehe..;)

Post by xmanflash // Apr 4, 2007, 9:22pm

xmanflash
Total Posts: 335
If you need any spare bits for your Trident, I have a spare 2MB version in my storage somewhere, but let me warn you, it will be worth top dollar :D


It was really fast in my 386 about 40 years ago!

Post by Jack Edwards // Apr 4, 2007, 10:41pm

Jack Edwards
Total Posts: 4062
pic
Ahh.... those were the days. I remember my 2MB Trident fondly... :D


-Jack.

Post by Garion // Apr 5, 2007, 10:04am

Garion
Total Posts: 116
pic
I have an old 512K Trident in a machine in the loft (I collect old PCs and computers). My wife just calls it junk :)


In my new PC I have 2x 7950GT 512MB GeForce cards in SLI mode.. I now find that my render is done before I even start to model :D


Cheers


Garion

Post by prodigy // Apr 5, 2007, 10:09am

prodigy
Total Posts: 3029
pic
Thats true...


HO!! I remember, I found a very very nice URL.. wanna see the next gen of real-time lighting??



Vray will be like those old Tridents very soon...


Check this out.. (http://www.geomerics.com/)



Real-time radiosity, soft shadows, GI... all in real time.. only needs caustics.. ;)


And here is the video page.. (http://www.geomerics.com/index.php?page=lighting)
Awportals.com is a privately held community resource website dedicated to Active Worlds.
Copyright (c) Mark Randall 2006 - 2024. All Rights Reserved.