
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

New computer ordered
2006-02-07, 5:57 AM #41
[QUOTE=Cool Matty]I gave perfectly legit reasons for not getting either of the products you mentioned. You'll have to find a better reason than "I'm biased" to counter it.[/QUOTE]
"LOL d00d 8 yrs ago gigabyte made a bad motherboard. THEY SUX."
D E A T H
2006-02-07, 6:38 AM #42
P4/HT 3.0GHz
1GB RAM
200GB (3HDs)
256MB ATI RADEON 9600 SERIES
CD-RW
DVD-ROM


I need a DVD-Burner sometime.
twitter | flickr | last.fm | facebook |
2006-02-07, 6:43 AM #43
iMac G5 1.9 ghz
OSX 10.4.4
1 gig DDR2 RAM
160 gb SATA HD, 200 gb Firewire HD
ATI Radeon X600
Dual-layer DVD burner
2006-02-07, 8:37 AM #44
[QUOTE=Dj Yoshi]...

Are you ****ing with me? Or did I seriously just read this post? I mean, games being GPU limited...that's gotta be a joke. Please tell me that's a joke.

[/QUOTE]


So... how are you liking games running at 1024x768? Meanwhile, everyone else has moved on to higher, more GPU-intensive resolutions. When we have decent PCs we like to run at higher resolutions, like 1600 x 1200, 1680 x 1050 and 1280 x 1024. And we even like to use AA and AF sometimes. :rolleyes:
2006-02-07, 8:55 AM #45
Originally posted by Obi_Kwiet:
So... how are you liking games running at 1024x768? Meanwhile, everyone else has moved on to higher, more GPU-intensive resolutions. When we have decent PCs we like to run at higher resolutions, like 1600 x 1200, 1680 x 1050 and 1280 x 1024. And we even like to use AA and AF sometimes. :rolleyes:


I'm monitor-limited to 1280x1024. :(
woot!
2006-02-07, 9:00 AM #46
P4 2.8ghz
1 gig ram
250gb HDD
ATI Radeon x800 256MB
intel motherboard (can't be arsed to remember the model number)
nifty case with one of those ducts for venting PC fans and a snap-on, snap-off panel.
sony DVD drive
sony DVD-R/RW

oh yeah, and a 19" CRT that's freakin' massive.
My girlfriend paid a lot of money for that tv; I want to watch ALL OF IT. - JM
2006-02-07, 9:00 AM #47
[QUOTE=Dj Yoshi]Keep telling yourself that slugger ;)[/QUOTE]

Oh I will, cause it's true. ;)

Don't underestimate a computer that has a few extra cards to help an overclocked CPU.
The right man in the wrong place can make all the difference in the world.

-G Man
2006-02-07, 9:01 AM #48
Originally posted by Obi_Kwiet:
So... how are you liking games running at 1024x768? Meanwhile, everyone else has moved on to higher, more GPU-intensive resolutions. When we have decent PCs we like to run at higher resolutions, like 1600 x 1200, 1680 x 1050 and 1280 x 1024. And we even like to use AA and AF sometimes. :rolleyes:

Most people run their games at 1280 and below. The resolution where you start to see any change from videocard to videocard is 16x12, and even then it's usually not that great. You're really misinformed if you think games are GPU limited. Hell, most people don't even have monitors big enough for 16x12.

PS--I run everything at 1280x960 2xqaa 4xaf. Please, don't talk down to me, especially when you're not only wrong, but also know nothing about computers (quite obviously)

Originally posted by KnightRider2000:
Oh I will, cause it's true. ;)

Don't underestimate a computer that has a few extra cards to help an overclocked CPU.

What? Extra cards? It doesn't matter what "cards" you have, the 9600XT isn't that great of a video card. And your CPU is an Intel Celeron. Not exactly what I'd call a "performance gaming CPU" at any speed.
D E A T H
2006-02-07, 9:09 AM #49
I play DOOM 3 on a Pentium 128

It works just fine
"it is time to get a credit card to complete my financial independance" — Tibby, Aug. 2009
2006-02-07, 9:24 AM #50
Gee, Yoshi, you know so much! Maybe you should drop in on Xbit Labs and tell everyone what they're doing wrong. If I look, I think I can find a few other sites that would benefit from your knowledge.


http://www.xbitlabs.com/articles/cpu/display/28cpu-games_2.html

In these benchmarks they did all their tests at 1024 x 768 with no AA or AF, except on Serious Sam. They also used medium or low settings. Half the games were not CPU limited even at these settings, and the ones that were had such ridiculously high frame rates that it would not make any difference to anyone. (Oh noes, I'm getting only 400fps!)

In the one exception, Serious Sam 2, the CPU seems to make a difference even with 2x AA and AF, though still at medium quality and 1024 x 768. Interestingly enough, the 4800+ outperforms the FX-57 despite the FX-57's 400MHz advantage. The game is multithreaded.
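The way Obi_Kwiet is reading those benchmark tables can be sketched as a quick script. The idea: at a fixed resolution and GPU, if frame rates barely move when you swap CPUs, the game is GPU-limited at those settings. The CPU names and fps numbers below are invented for illustration, not taken from the Xbit article.

```python
# Hypothetical helper for reading a CPU-scaling benchmark table:
# large fps spread across CPUs -> the CPU matters (CPU-limited);
# tiny spread -> the GPU is the cap (GPU-limited).

def cpu_limited(fps_by_cpu, threshold=0.10):
    """True if the fps spread across CPUs exceeds `threshold` (as a fraction of the minimum)."""
    lo, hi = min(fps_by_cpu.values()), max(fps_by_cpu.values())
    return (hi - lo) / lo > threshold

# At 1024x768 with a fast GPU, the CPU can still separate the results...
print(cpu_limited({"FX-57": 400.0, "3000+": 250.0}))   # True
# ...but at 1600x1200 the GPU caps every CPU at roughly the same rate.
print(cpu_limited({"FX-57": 62.0, "3000+": 60.0}))     # False
```

The 10% threshold is an arbitrary cutoff; real reviews eyeball the same spread.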


So, in the future, Yoshi, think before you flame, especially if you're arguing about video cards and flaming a geek who spends an hour a day browsing a forum dedicated to video cards.
2006-02-07, 9:30 AM #51
[QUOTE=Dj Yoshi]Are you ****ing with me? Or did I seriously just read this post? I mean, games being GPU limited...that's gotta be a joke. Please tell me that's a joke.[/QUOTE]Modern GPUs are responsible for every formerly processor-intensive task. Additionally, modern GPUs render scenes in parallel to the processor, so it's a non-blocking operation.

The only thing the CPU has to do is set up the scene (non-blocking), run the physics simulation and the AI. Current physics and AI simulations are decidedly more primitive than the graphics in most games. And since the GPU is doing absolutely everything graphics-related: games are definitely... definitely GPU-limited.
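Jon`C's point about parallel, non-blocking rendering can be put as a toy model: since the CPU prepares the next frame while the GPU draws the current one, steady-state frame time is set by whichever side is slower. All the millisecond figures below are made up for illustration.

```python
# Toy model of a pipelined renderer: CPU and GPU overlap, so the
# frame time is the maximum of the two, not the sum.

def frame_time_ms(cpu_ms, gpu_ms):
    """Steady-state frame time when CPU and GPU work in parallel."""
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms, gpu_ms):
    return 1000.0 / frame_time_ms(cpu_ms, gpu_ms)

# GPU-limited: doubling CPU speed changes nothing.
print(fps(cpu_ms=5.0, gpu_ms=20.0))   # 50.0
print(fps(cpu_ms=2.5, gpu_ms=20.0))   # 50.0

# CPU-limited: now it's the faster GPU that changes nothing.
print(fps(cpu_ms=20.0, gpu_ms=5.0))   # 50.0
print(fps(cpu_ms=20.0, gpu_ms=2.5))   # 50.0
```

Real drivers buffer a frame or two ahead, but the max() intuition is the core of the argument.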
2006-02-07, 9:55 AM #52
Originally posted by Jon`C:
Modern GPUs are responsible for every formerly processor-intensive task. Additionally, modern GPUs render scenes in parallel to the processor, so it's a non-blocking operation.

The only thing the CPU has to do is set up the scene (non-blocking), run the physics simulation and the AI. Current physics and AI simulations are decidedly more primitive than the graphics in most games. And since the GPU is doing absolutely everything graphics-related: games are definitely... definitely GPU-limited.

In all but the latest games, the 7800GT and the 7800GTX perform almost indistinguishably (and even then it's hard to tell a huge difference--10 frames maybe), when the only difference is a slight clock difference and 4 pipelines, something that caused a HUGE disparity in performance between the 6800 and the 6800GT+, but not so much now.

I can KINDA see where he's going with this, but he just proved my point.

PS--Thanks for agreeing and proving my point Obi?

PPS, I spend all day browsing hardware news, forums, etc., yet Jon still knows more about the scene than I will for a while. It's not about how much time you spend per day, it's about how long you've been doing it, how much you've gleaned from it, and how up-to-date you're kept. A perfect example is the recent hardware thread where SD_Rakishi said the 7800GT was well out of the price range of a 1000CAD PC. He would have been right at the release of the 7800GT, but now, just 6 months later, he's not. I only flame you because you're a little kid who thinks sarcasm gets his point across. Jon put it much more eloquently, and in a much better form than you ever would/could/will, and that's half of every argument/presentation.

This makes me somewhat of a hypocrite, but I think you get the point.
D E A T H
2006-02-07, 10:05 AM #53
[QUOTE=Dj Yoshi]In all but the latest games, the 7800GT and the 7800GTX perform almost indistinguishably (and even then it's hard to tell a huge difference--10 frames maybe), when the only difference is a slight clock difference and 4 pipelines, something that caused a HUGE disparity in performance between the 6800 and the 6800GT+, but not so much now.[/QUOTE]Because those extra pipelines on the 6800GT were used. Throwing extra pipelines, pixel or vertex shading units onto a card won't do anything unless the game you're playing requires it.

The 10 fps difference is the increased clockrate. If it were CPU-limited you wouldn't have seen any difference at all.
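Jon`C's claim here has a simple quantitative form: in a fully GPU-bound game, frame rate scales roughly linearly with core clock, while a CPU-bound game would show no gain at all. The 400/430 MHz figures are the 7800 GT/GTX reference core clocks; the 60 fps baseline is invented.

```python
# Rough first-order model: GPU-bound fps scales with GPU core clock;
# CPU-bound fps ignores the GPU clock entirely.

def scaled_fps(base_fps, base_clock, new_clock, gpu_bound=True):
    if not gpu_bound:
        return base_fps                       # CPU sets the pace
    return base_fps * new_clock / base_clock  # linear clock scaling

gt_fps = 60.0
gtx_fps = scaled_fps(gt_fps, base_clock=400, new_clock=430)
print(round(gtx_fps, 1))                              # 64.5 -> a handful of frames
print(scaled_fps(gt_fps, 400, 430, gpu_bound=False))  # 60.0 -> zero gain if CPU-bound
```

A "handful of frames" from clock alone matches the observed GT-vs-GTX gap; a truly CPU-limited game would give the second result.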
2006-02-07, 10:15 AM #54
Originally posted by Jon`C:
Because those extra pipelines on the 6800GT were used. Throwing extra pipelines, pixel or vertex shading units onto a card won't do anything unless the game you're playing requires it.

The 10 fps difference is the increased clockrate. If it were CPU-limited you wouldn't have seen any difference at all.

Not really, you'd just see diminished gains. As is quite evident.

And the pixel pipelines and vertex shaders should be readily put to use with games out there that use such advanced lighting and texture effects like FEAR, and Doom 3 engine games (mainly Q4 at the moment with ET:QW and Prey coming up), yet you didn't see that much of a difference. And if the clock speed makes such a difference, tell me why adding a good 15-20% performance boost on my 6800GT via RAM and GPU clock speeds doesn't net me much gain at all?
D E A T H
2006-02-07, 11:05 AM #55
[QUOTE=Dj Yoshi]Not really, you'd just see diminished gains. As is quite evident.[/QUOTE]No, you wouldn't see any.

Quote:
And the pixel pipelines and vertex shaders should be readily put to use with games out there that use such advanced lighting and texture effects like FEAR, and Doom 3 engine games (mainly Q4 at the moment with ET:QW and Prey coming up), yet you didn't see that much of a difference.
Again: if the game is written to use hardware features, they will be used. If a game renders a scene using 8 textures per pass in 2 passes, even though a card supports 16 textures per pass, you aren't automatically gaining the use of those extra texture passes. Additionally, even though the 7800 GTX supports 24 textures per pixel per frame, it only supports 16 shaders - the same as the 7800 GT. The situation is much more complicated than "more pipelines = good".

Quote:
And if the clock speed makes such a difference, tell me why adding a good 15-20% performance boost on my 6800GT via RAM and GPU clock speeds doesn't net me much gain at all?
I'd guess that either your computer is CPU-limited or your video memory has poor latency. Even though the 6800GT reference board is an underclocked Ultra there is no set specification in place for the quality of components manufacturers will use.
Additionally, the 6800 Ultra clock rate is more or less up to the manufacturer as well. Can you overclock your 6800GT core to 450 MHz?
2006-02-07, 11:11 AM #56
Originally posted by Jon`C:
No, you wouldn't see any.

Again: If the game is written to use hardware features they will be used. If a game renders a scene using 8 textures per pass in 2 passes, even though a card supports 16 textures per pass, you aren't automatically gaining the use of those extra texture passes. Additionally, even though the 7800 GTX supports 24 textures per pixel per frame, it only supports 16 shaders - the same as the 7800 GT. The situation is much more complicated than "More pipelines = good"

I'd guess that either your computer is CPU-limited or your video memory has poor latency. Even though the 6800GT reference board is an underclocked Ultra there is no set specification in place for the quality of components manufacturers will use.
Additionally, the 6800 Ultra clock rate is more or less up to the manufacturer as well. Can you overclock your 6800GT core to 450 MHz?

Why wouldn't you see any?

The second bit makes a bit of sense...I think. I'm not sure what shaders have to do with textures, but okay.

My computer is a bit CPU limited, but it shouldn't be. My video memory shouldn't have poor latency--GDDR3 memory in an 8x AGP slot...I don't see what could be limiting it (the AGP aperture only has problems handling stuff like SLI, IIRC). I could overclock it to 450 when it was brand new and the airflow in my case was excellent...anymore, though, only like 420MHz.
D E A T H
2006-02-07, 11:24 AM #57
If you want to see what CPU-limited is, go do benchmarks with Quake III. Video cards nowadays are so flippin fast that they could probably run the entire game right off the card, if it were possible.
2006-02-07, 11:30 AM #58
[QUOTE=Cool Matty]If you want to see what CPU-limited is, go do benchmarks with Quake III. Video cards nowadays are so flippin fast that they could probably run the entire game right off the card, if it were possible.[/QUOTE]
A bit off topic, but true nonetheless :p
D E A T H
2006-02-07, 11:31 AM #59
And the sad thing is, Quake 3's graphics still beat the pants off many modern games. :rolleyes:

Yeah, I'm serious.
"it is time to get a credit card to complete my financial independance" — Tibby, Aug. 2009
2006-02-07, 11:52 AM #60
[QUOTE=Dj Yoshi]Why wouldn't you see any?[/QUOTE]Because the limiting factor would be the CPU which is independent of the video card. As I mentioned above, rendering operations are conducted in parallel to the video card: only locking something in video memory causes the game's thread to block. The video card is allowed to keep rendering while the CPU starts crunching numbers for the next frame.

If games were CPU-bound the GPU would be constantly stalling, waiting for the CPU to tell it what to do. A good way to test this is to benchmark your system with AGP Fast Writes enabled and then disabled. If it's faster with AGP Fast Writes on, the system's bottleneck is either the GPU or the RAM.

Quote:
The second bit makes a bit of sense...I think. I'm not sure what shaders have to do with textures, but okay.
I'm not familiar with how Fear works because I've never played it, but Doom 3 is a known quantity.
The first pass on the scene is a Z-fill pass. The geometry is rendered but textures and shaders are disabled. This prevents overdraw. The 7800 GTX has, IIRC, 4 dedicated Z-fill pipelines to accelerate this process.
Each polygon in Doom 3 has a texture map, a normal map and a gloss map. On modern hardware this can be done in one pass.
The first textured pass is for ambient lighting.
Then you render the scene once for each light and combine it with the first pass using a stencil test against the shadow volumes.
I believe the average scene in Doom 3 had something like 3 or 4 lights, but I can't remember Doom 3 too well because it was a terrible game.

That's ~6 passes, even on the newest hardware. If the underlying scene gets more complex - say, if it uses 24 textures per polygon - the 7800 GT will have to render the scene 11 times while the 7800 GTX will still be able to do it in 6 passes. Fewer passes also mean fewer stencil tests, so in this theoretical future game the 7800 GTX will slaughter the 7800 GT.

Problem is, no game uses 24 textures per polygon. Or 16 fragment shaders per pixel. At least the 48 pixel shading units on the X1900 are used in parallel, accelerating overall scene rendering even when the scene doesn't call for 48 shaders in a single pixel.
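The pass arithmetic in the paragraphs above boils down to a ceiling division: the number of passes needed to apply T textures on hardware that binds at most K textures per pass. The per-card limits (16 vs 24) are taken from the post itself, not from a spec sheet, and the 24-texture scene is hypothetical.

```python
import math

# Passes required to apply `textures` texture layers when the hardware
# can bind at most `per_pass` textures in a single pass.
def passes(textures, per_pass):
    return math.ceil(textures / per_pass)

# A scene needing 8 textures fits either card in one textured pass:
print(passes(8, 16), passes(8, 24))    # 1 1
# The hypothetical 24-textures-per-polygon scene splits them apart:
print(passes(24, 16), passes(24, 24))  # 2 1
```

Each extra pass also re-runs geometry and stencil work, which is why the gap compounds beyond the raw pass count.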

Quote:
My computer is a bit CPU limited, but it shouldn't be. My video memory shouldn't have poor latency--GDDR3 memory in an 8x AGP slot...I don't see what could be limiting it (the AGP aperture only has problems handling stuff like SLI, IIRC). I could overclock it to 450 when it was brand new and the airflow in my case was excellent...anymore, though, only like 420MHz.
Not all GDDR3 memory is made equally. For instance, some X800 Pros were made with lower-latency GDDR3 memory so they could be overclocked to X800 XT speeds. Ultimately it's a limitation of the hardware: overclocking bad memory is like trying to turn a... excuse the car metaphor... Honda Civic into a Viper.
2006-02-07, 11:57 AM #61
Quote:
PS--Thanks for agreeing and proving my point Obi?


Did you read my post and the article? It was conclusive that in every game, unless you run 1024 x 768 with no AA/AF, a better GPU will help you most. In addition, when a game is CPU limited, the frames are so high it doesn't matter anyway.

Quote:
PPS, I spend all day browsing hardware news, forums, etc., yet Jon still knows more about the scene than I will for a while. It's not about how much time you spend per day, it's about how long you've been doing it, how much you've gleaned from it, and how up-to-date you're kept. A perfect example is the recent hardware thread where SD_Rakishi said the 7800GT was well out of the price range of a 1000CAD PC. He would have been right at the release of the 7800GT, but now, just 6 months later, he's not. I only flame you because you're a little kid who thinks sarcasm gets his point across. Jon put it much more eloquently, and in a much better form than you ever would/could/will, and that's half of every argument/presentation.

This makes me somewhat of a hypocrite, but I think you get the point.


I was sarcastic because you flamed me about computer hardware. If you want to be eloquent, that's not how. If I'm mad at someone I don't flame them, I just get sarcastic. But I did link to an article that explained the whole thing, with benchmarks.
2006-02-07, 1:28 PM #62
I have this:

AMD 64 3700+
Some Gigabyte motherboard
2gigs of ram
Ati Radeon x850xt 256mb pci-e
A bunch of other stuff you don't care about.

My power supply isn't powerful enough for this pc, so my cpu runs at 1005.01 mhz instead of 2.2 ghz. If I unplug all of my fans and such, it runs at like 1006.01 mhz.

New psu will be here in days. The funny part is, I can currently run Battlefield 2 at full details, 16x af 6x fsaa at 1280x1024 at about 50fps. I can't wait to see what happens when my pc is actually running at full power... and maybe a little bit more.
>>untie shoes
2006-02-07, 1:34 PM #63
Originally posted by Bill:
My power supply isn't powerful enough for this pc, so my cpu runs at 1005.01 mhz instead of 2.2 ghz. If I unplug all of my fans and such, it runs at like 1006.01 mhz.
Your multiplier is set incorrectly.
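Jon`C's diagnosis is easy to check with arithmetic: an Athlon 64's reported core clock is just FSB (HTT) × multiplier, and 1005 MHz is almost exactly a 5x multiplier on a ~201 MHz bus, which is where Cool'n'Quiet parks an idle Athlon 64. That's consistent with a multiplier/power-management setting, not a weak PSU. The 201 MHz bus figure is inferred from the post's 1005.01 number.

```python
# Reported CPU clock = bus (FSB/HTT) frequency x multiplier.
def core_clock_mhz(fsb_mhz, multiplier):
    return fsb_mhz * multiplier

print(core_clock_mhz(201.0, 5))    # 1005.0 -> the speed Bill is actually seeing
print(core_clock_mhz(200.0, 11))   # 2200.0 -> the 3700+'s rated 2.2 GHz (11x200)
```

A PSU brown-out wouldn't land the clock on such a clean multiplier step; a dropped multiplier does.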
2006-02-07, 1:34 PM #64
Originally posted by Jon`C:
Because the limiting factor would be the CPU which is independent of the video card. As I mentioned above, rendering operations are conducted in parallel to the video card: only locking something in video memory causes the game's thread to block. The video card is allowed to keep rendering while the CPU starts crunching numbers for the next frame.

If games were CPU-bound the GPU would be constantly stalling, waiting for the CPU to tell it what to do. A good way to test this is to benchmark your system with AGP Fast Writes enabled and then disabled. If it's faster with AGP Fast Writes on, the system's bottleneck is either the GPU or the RAM.

I'm not familiar with how Fear works because I've never played it, but Doom 3 is a known quantity.
The first pass on the scene is a Z-fill pass. The geometry is rendered but textures and shaders are disabled. This prevents overdraw. The 7800 GTX has, IIRC, 4 dedicated Z-fill pipelines to accelerate this process.
Each polygon in Doom 3 has a texture map, a normal map and a gloss map. On modern hardware this can be done in one pass.
The first textured pass is for ambient lighting.
Then you render the scene once for each light and combine it with the first pass using a stencil test against the shadow volumes.
I believe the average scene in Doom 3 had something like 3 or 4 lights, but I can't remember Doom 3 too well because it was a terrible game.

That's ~6 passes, even on the newest hardware. If the underlying scene gets more complex - say, if it uses 24 textures per polygon - the 7800 GT will have to render the scene 11 times while the 7800 GTX will still be able to do it in 6 passes. Fewer passes also mean fewer stencil tests, so in this theoretical future game the 7800 GTX will slaughter the 7800 GT.

Problem is, no game uses 24 textures per polygon. Or 16 fragment shaders per pixel. At least the 48 pixel shading units on the X1900 are used in parallel, accelerating overall scene rendering even when the scene doesn't call for 48 shaders in a single pixel.

Not all GDDR3 memory is made equally. For instance, some X800 Pros were made with lower-latency GDDR3 memory so they could be overclocked to X800 XT speeds. Ultimately it's a limitation of the hardware: overclocking bad memory is like trying to turn a... excuse the car metaphor... Honda Civic into a Viper.

Thanks for clearing that up, but I'm pretty sure my VRAM is fine.

Originally posted by Obi_Kwiet:
Did you read my post and the article? It was conclusive that in every game, unless you run 1024 x 768 with no AA/AF, a better GPU will help you most. In addition, when a game is CPU limited, the frames are so high it doesn't matter anyway.

I was sarcastic because you flamed me about computer hardware. If you want to be eloquent, that's not how. If I'm mad at someone I don't flame them, I just get sarcastic. But I did link to an article that explained the whole thing, with benchmarks.

And you see minimal gains at 1280x960/1024. Meaning that for almost all gamers, you're going to see more improvement by upgrading your processor than your video card, and you'd be surprised at how low the framerates can be. I also wouldn't call a 60fps swing in the 50-120 fps range a point where the frames are "so high it doesn't matter anyway". Personally, I consider anything under 80 not to be butter smooth (assuming of course that you have an LCD screen or a high-refresh CRT), anything under 60 to be somewhat choppy, and anything under 30 just plain unplayable.
D E A T H
2006-02-07, 1:35 PM #65
Originally posted by Jon`C:
Your multiplier is set incorrectly.

No it's not. I have increased the multiplier many times attempting to fix the situation. Any changes to fsb or multiplier are null. When I boot the computer, the cpu is running at the same speed. It doesn't have enough juice. I can also attest to this being the problem because the same thing happens if I try to oc my video card or ram.
>>untie shoes
2006-02-07, 1:45 PM #66
I feel so sorry for the kid with the Celeron. :(
2006-02-07, 2:31 PM #67
[QUOTE=Dj Yoshi]Personally, I consider anything under 80 not to be butter smooth (assuming of course that you have an LCD screen or a high refresh CRT), and anything under 60 to be somewhat choppy, along with anything under 30 just plain unplayable.[/QUOTE]
I think you would have shot yourself by now if you were playing on my computer. I run CS:Condition Zero's Deleted Scenes at 20 FPS flat. I consider it extremely playable.

Edit: :( This thread and my post makes me realize I really do need this new computer.
I had a blog. It sucked.
2006-02-07, 3:05 PM #68
Originally posted by Zloc_Vergo:
I think you would have shot yourself by now if you were playing on my computer. I run CS:Condition Zero's Deleted Scenes at 20 FPS flat. I consider it extremely playable.

Edit: :( This thread and my post makes me realize I really do need this new computer.


Ew, 20fps? I can't even aim properly at that sort of choppiness. 30fps is acceptable enough to be playable, 45fps is a good mark to aim for, and if it holds 60fps, I couldn't be happier.

80fps is kinda excessive; I can't really tell the difference between 60 and 80, myself.
.hack//SIGN - The World - Just Believe

(Yes, This is Cool Matty)
2006-02-07, 3:10 PM #69
I can play just FINE at 20. You guys must really be picky or something...granted, it can be better, but 20 is completely playable.

At least, Fraps had this big yellow 20 in the corner while I was playing...I assume this means 20 fps.

Although, at the jungle level, it got down to the lower teens and THAT was unplayable.
I had a blog. It sucked.
2006-02-07, 3:58 PM #70
Originally posted by Zloc_Vergo:
I can play just FINE at 20. You guys must really be picky or something...granted, it can be better, but 20 is completely playable.

At least, Fraps had this big yellow 20 in the corner while I was playing...I assume this means 20 fps.

Although, at the jungle level, it got down to the lower teens and THAT was unplayable.

You'll realize why we say 20 fps is unplayable when you get a better computer. You miss a lot of frames at 20fps in order to sync up with the server.
D E A T H
2006-02-07, 4:10 PM #71
[QUOTE=Dj Yoshi]You'll realize why we say 20 fps is unplayable when you get a better computer. You miss a lot of frames at 20fps in order to sync up with the server.[/QUOTE]
I don't play online. Maybe that's what I'm missing?
I had a blog. It sucked.
2006-02-07, 4:20 PM #72
Originally posted by Zloc_Vergo:
I don't play online. Maybe that's what I'm missing?


Probably. Playing online at less than 25-30fps is suicide. You'll never get the headshots you need, and you won't have good enough reaction time, simply because other players will seem to skip, or not even be there before they're shooting. It's almost like hardware-induced lag.
.hack//SIGN - The World - Just Believe

(Yes, This is Cool Matty)
2006-02-07, 4:29 PM #73
Originally posted by Obi_Kwiet:
Just wait till I have my Opteron 165 running at 2.8 Ghz. How fast is NoESC's CPU?


AMD Athlon64 X2 4400+
2x 7800GTX in SLI
2x WD Raptor 74GB in RAID

other stuff that doesn't matter
gbk is 50 probably

MB IS FAT
2006-02-07, 8:26 PM #74
Originally posted by NoESC:
AMD Athlon64 X2 4400+
2x 7800GTX in SLI
2x WD Raptor 74GB in RAID

other stuff that doesn't matter


"other stuff that doesn't matter" == stolen credit card to buy said system

:p :D
2006-02-07, 8:29 PM #75
[QUOTE=Cool Matty]"other stuff that doesn't matter" == stolen credit card to buy said system

:p :D[/QUOTE]
Dude, the guy just about gave, GAVE me a Creative X-Fi for shipping and handling.

He's gotta be loaded. :(
D E A T H
2006-02-08, 8:13 AM #76
Originally posted by NoESC:
AMD Athlon64 X2 4400+
2x 7800GTX in SLI
2x WD Raptor 74GB in RAID

other stuff that doesn't matter



Whoa! Nice GPU set up! How's the 4400+ OC?
2006-02-08, 1:48 PM #77
Haven't felt the need to OC it yet ;)
gbk is 50 probably

MB IS FAT
