
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

Next-gen consoles
2008-05-15, 6:12 PM #41
Originally posted by Baconfish:
Actually I have, and I'm sure it's on YouTube, to clarify.

:P

Well, in any case, we're not talking about one console so the plural would be correct.
D E A T H
2008-05-15, 6:17 PM #42
I was more getting at the fact that the other way you can read it wouldn't make sense with the 's'.

:P
nope.
2008-05-15, 6:42 PM #43
Originally posted by Baconfish:
I was more getting at the fact that the other way you can read it wouldn't make sense with the 's'.

:P

If it had multiple cover art it would be appropriate.
D E A T H
2008-05-15, 7:22 PM #44
Two GB of RAM for non-craptacular textures.

Probably not as big a difference as the last consoles, though, due to the slump in the GPU industry. Either that or it'll be a really long time. Probably the latter.
2008-05-15, 9:13 PM #45
...the 'slump,' as you put it, is not restricted to graphics card makers and also doesn't really exist. Graphics card performance is improving geometrically each generation but it's in a way that's difficult to observe unless you're playing a game that actually makes use of those features. This is a large part of the reason we're starting to see physics tasks and general-purpose computation offloaded onto the GPU. The same thing applies to CPUs, where every new generation of Intel processor doubles the number of available cores on a single processor.

Unfortunately making use of the improved performance with modern tools requires a great deal of additional work. Seriously though, they are a lot faster.
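The point about cores and "additional work" above can be illustrated with a minimal Python sketch (the `simulate` workload is hypothetical, just a stand-in for any CPU-bound task): extra cores do nothing for a plain serial loop until the work is explicitly split across a process pool.

```python
# Minimal sketch: extra cores only help once the work is explicitly
# parallelised. The per-task workload below is a hypothetical stand-in.
from multiprocessing import Pool, cpu_count

def simulate(seed):
    # CPU-bound busywork; depends only on its own inputs.
    total = 0
    for i in range(10_000):
        total += (seed * 1103515245 + i) % 65536
    return total

if __name__ == "__main__":
    # Serial: uses one core no matter how many are available.
    serial = [simulate(s) for s in range(8)]

    # Parallel: the same tasks spread over every available core.
    with Pool(cpu_count()) as pool:
        parallel = pool.map(simulate, range(8))

    assert serial == parallel  # identical results, just computed concurrently
```

The restructuring (pure per-task functions, explicit distribution, merging results) is the extra work; the same shape applies whether the tasks land on CPU cores or a GPU.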
2008-05-15, 9:28 PM #46
Indeed. However, I don't understand why people say Crysis looks so good; for me, textures turn to crap six feet away. On an 8800 GT.
2008-05-16, 9:32 AM #47
One thing I expect to see a lot more of:
A bigger focus on the environment than the graphics. Diversity in character models, being able to interact with every little thing around you, more destructible environments, etc.
2008-05-16, 11:27 AM #48
Originally posted by Tiberium_Empire:
Indeed. However, I don't understand why people say Crysis looks so good; for me, textures turn to crap six feet away. On an 8800 GT.

You agree with Joncy, acting like you know what he's talking about, then turn around and state something that's, quite obviously, an LoD issue.

Laffo.
D E A T H
2008-05-16, 12:20 PM #49
Aglar, that's why I have high hopes for id Tech 5. They're focusing a lot more on content authoring tools and procedural generation.
2008-05-16, 12:27 PM #50
Next-gen consoles need more mod support and free downloads, rather than charging people $10 for 4 maps and all the other bull****
"Nulla tenaci invia est via"
2008-05-16, 12:30 PM #51
Originally posted by Jon`C:
Aglar, that's why I have high hopes for id Tech 5. They're focusing a lot more on content authoring tools and procedural generation.

I hadn't heard about this yet, but now I'm pretty damned excited.
2008-05-16, 12:37 PM #52
Originally posted by Jon`C:
...the 'slump,' as you put it, is not restricted to graphics card makers and also doesn't really exist. Graphics card performance is improving geometrically each generation but it's in a way that's difficult to observe unless you're playing a game that actually makes use of those features. This is a large part of the reason we're starting to see physics tasks and general-purpose computation offloaded onto the GPU. The same thing applies to CPUs, where every new generation of Intel processor doubles the number of available cores on a single processor.


It's not so much the generation-to-generation performance that's the issue. Generations are getting farther apart due to troubles with the industry. June will be the first time since Nov. '06 that we'll have seen anything new.
2008-05-16, 3:38 PM #53
Originally posted by Aglar:
One thing I expect to see a lot more of:
A bigger focus on the environment than the graphics. Diversity in character models, being able to interact with every little thing around you, more destructible environments, etc.

Battlefield: Bad Company looks good for this.

Originally posted by Obi_Kwiet:
It's not so much the generation-to-generation performance that's the issue. Generations are getting farther apart due to troubles with the industry. June will be the first time since Nov. '06 that we'll have seen anything new.
Yeah, except for the, you know, 9800GTX.
D E A T H
2008-05-16, 3:46 PM #54
...which is no better performance-wise than the 8800GTX (and in some circumstances worse; the 8800GTX is still better for Crysis).
TheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkWhoSaysNiTheJkW
2008-05-16, 3:55 PM #55
Originally posted by TheJkWhoSaysNi:
...which is no better performance-wise than the 8800GTX (and in some circumstances worse; the 8800GTX is still better for Crysis).

Then look at the 9800GX2. *shrug* Either way, new is new.
D E A T H
2008-05-16, 4:18 PM #56
Originally posted by TheJkWhoSaysNi:
...which is no better performance-wise than the 8800GTX (and in some circumstances worse; the 8800GTX is still better for Crysis).


I would like to refer you to my post above.
2008-05-16, 4:21 PM #57
Originally posted by Dj Yoshi:
Then look at the 9800GX2. *shrug* Either way, new is new.


That's still not new. That's two tweaked, slightly less powerful G92s bolted together. They could have essentially released that alongside the 8800GTX in 2006 if they chose to.

The G92 is a very minor core tweak. It's less significant than the 7800-7900 refresh.
2008-05-16, 4:23 PM #58
Originally posted by Obi_Kwiet:
That's still not new. That's two tweaked, slightly less powerful G92s bolted together. They could have essentially released that alongside the 8800GTX in 2006 if they chose to.

The G92 is a very minor core tweak. It's less significant than the 7800-7900 refresh.

Tell that to the new 8800GTS which destroys the old one.
D E A T H
2008-05-16, 4:44 PM #59
Hey guys, remember the Voodoo 2? Remember how it was seriously two unmodified overclocked Voodoo 1 TMUs hooked together on the same card? Remember how a Voodoo 2 only had marginally better fillrate than a Voodoo 1 unless the developer used multitexturing extensions?

Remember how fragment and vertex shaders are unable to access the properties of any other fragment or vertex, making parallel rendering trivial and making "two 8800s glued together" a really smart way of doing something if you already have a design that works and scales up really well?

I mean, what do you expect? Look at the CPU market. Pentium 4 was Intel's newest from-scratch architecture and it was dropped in favor of Core 2, whose lineage can be traced back to the Pentium-frickin'-2. You're going to be seeing more and more of this now that ATI and NVIDIA have broken down their shaders into tightly optimized programmable vector units. Your performance in a game that exists right now will probably never get much better.
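The fragment-independence point above can be sketched in a few lines of toy Python (the per-pixel function is made up, not a real shader): because no fragment reads any other fragment, any partition of the pixels across "GPUs" produces exactly the same image.

```python
# Toy illustration of fragment independence: each pixel's colour is a
# pure function of its own coordinates, so the frame can be split
# across any number of workers with no communication between them.
WIDTH, HEIGHT = 8, 4

def fragment_shader(x, y):
    # Hypothetical per-pixel computation; reads no neighbouring fragment.
    return (x * 31 + y * 17) % 256

def render(rows):
    # Any subset of scanlines can be rendered independently, in any order.
    return {(x, y): fragment_shader(x, y) for y in rows for x in range(WIDTH)}

# "Two GPUs": each renders half the scanlines; merging gives the full frame.
top = render(range(0, HEIGHT // 2))
bottom = render(range(HEIGHT // 2, HEIGHT))
image = {**top, **bottom}
assert image == render(range(HEIGHT))  # identical to a single-GPU render
```

That independence is exactly what makes "two 8800s glued together" a sound design: the slices never need to talk to each other.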
2008-05-16, 5:23 PM #60
I remember the first graphics card upgrade I ever bought was a voodoo 3 16mb.. hardware rendering in JK was so SWEET
"Nulla tenaci invia est via"
2008-05-16, 8:05 PM #61
Originally posted by Dj Yoshi:
Tell that to the new 8800GTS which destroys the old one.


:psyduck:

The 8800GTS 512 was a totally different configuration from the 640. Just because they were able to tweak the core so they could sell a higher-specced version of it at a lower price doesn't mean it was a significant change in technology. The fact that they didn't release anything new meant that they were able to tweak their core for maximum performance for the yield.



Quote:
Hey guys, remember the Voodoo 2? Remember how it was seriously two unmodified overclocked Voodoo 1 TMUs hooked together on the same card? Remember how a Voodoo 2 only had marginally better fillrate than a Voodoo 1 unless the developer used multitexturing extensions?

Remember how fragment and vertex shaders are unable to access the properties of any other fragment or vertex, making parallel rendering trivial and making "two 8800s glued together" a really smart way of doing something if you already have a design that works and scales up really well?

I mean, what do you expect? Look at the CPU market. Pentium 4 was Intel's newest from-scratch architecture and it was dropped in favor of Core 2, whose lineage can be traced back to the Pentium-frickin'-2. You're going to be seeing more and more of this now that ATI and NVIDIA have broken down their shaders into tightly optimized programmable vector units. Your performance in a game that exists right now will probably never get much better.


It's still nothing new. If they wanted to, they could have tied 64 G70s together back in '05 and gotten awesome performance. For yield and cost reasons alone it makes sense that we should move away from one giant uber GPU to more smaller ones, but we need to move toward more GPUs integrated onto one card, sharing memory.

For CPUs, the future is more cores. If AMD or Intel had just started making two- or four-socket mobos and said, "Progress lolz!" they would have been laughed out of town. (Actually AMD did, and they were.)

The point is, whatever direction we go in, we need to continue to innovate in order to make feasible progress. Putting a PCI-E splitter on two cards will not take the place of real innovation, and is not the same thing as real progress. Devising a way to put more GPUs on one PCB sharing VRAM, well, that's a whole other story.

The bottom line is, innovation has been slowed down by lack of competition from ATI, hence the slump. How long did the 8800GTS cost $300? A heck of a lot longer than it would have if ATI had had something competitive.
2008-05-16, 10:15 PM #62
Obi, ...sigh.

NVIDIA cannot tie together 12 8800 GTXes and get the same performance because trace length becomes the bottleneck. SLI works by duplicating driver calls across multiple cards with different viewports. The new cards don't work like that. You're oversimplifying things to a preposterous degree.
I'll agree that the complete lack of competition has slowed down the graphics card market but the reason the 8800 GTX still isn't being fully-utilized is because this generation is a transitional one.

Again: the market is moving, cards are getting faster at the same rate they always have, but you will not see these improvements until developers put DirectX 9 to bed.
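The SLI scheme described above (the same command stream duplicated to every card, each with a different viewport) can be sketched as follows; the function names are illustrative, not a real driver API.

```python
# Sketch of split-frame SLI as described in the post: the driver sends
# identical commands to every card, but assigns each card a different
# viewport (here, a band of scanlines), then composites the bands.
HEIGHT = 8

def draw(viewport):
    # Each "card" executes the same commands, clipped to its viewport.
    y0, y1 = viewport
    return [("scanline", y) for y in range(y0, y1)]

def sli_render(num_cards):
    band = HEIGHT // num_cards
    slices = [draw((i * band, (i + 1) * band)) for i in range(num_cards)]
    return [row for s in slices for row in s]  # composite the bands

# The final frame is identical regardless of how many cards share it.
assert sli_render(2) == draw((0, HEIGHT))
```

Because the command stream is merely duplicated, this scales almost for free in hardware terms, which is why it is cheap compared to a genuinely shared-memory multi-GPU design.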
2008-05-16, 10:18 PM #63
Originally posted by Jon`C:
NVIDIA cannot tie together 12 8800 GTXes and get the same performance because trace length becomes the bottleneck.

Jon, you can't expect to win the argument when Obi has no idea what trace length is.
Bassoon, n. A brazen instrument into which a fool blows out his brains.
2008-05-17, 8:40 AM #64
Originally posted by Jon`C:
Obi, ...sigh.

NVIDIA cannot tie together 12 8800 GTXes and get the same performance because trace length becomes the bottleneck. SLI works by duplicating driver calls across multiple cards with different viewports. The new cards don't work like that. You're oversimplifying things to a preposterous degree.
I'll agree that the complete lack of competition has slowed down the graphics card market but the reason the 8800 GTX still isn't being fully-utilized is because this generation is a transitional one.

Again: the market is moving, cards are getting faster at the same rate they always have, but you will not see these improvements until developers put DirectX 9 to bed.


Clearly there are going to be issues with SLI once you put too many cards on there. That's not at all the point. And I'll also agree that current gen. cards are not being taken full advantage of. In fact, that's probably another reason that the industry is moving slowly right now.

I'm not at all complaining about the generation-to-generation performance difference between the G70 and G80. I'm saying that the time between major architecture changes is going up due to a slacking off of the competition between Nvidia and ATI that was previously so heated. My point is, since the G80, we haven't really had anything new, except the inevitable optimizations of the core, which are pretty minor, all things considered.

You keep getting so caught up in irrelevant technicalities that I have no idea what point you're trying to make.
2008-05-17, 11:50 AM #65
Um... if you actually understood the technicalities you might figure out that they're not irrelevant?

Like your comment about SLI, for instance. SLI has a nearly linear performance gain with respect to the number of cards. I have already said why this is possible. I have also stated that it's implementationally-costly, which is one of the reasons why that's not how the new cards work (even if it seems that way to you).
2008-05-17, 12:19 PM #66
This sounds like a debate between a marketing team and a developer...
"NAILFACE" - spe
2008-05-17, 12:26 PM #67
First, stop being so vague about "new cards". Are you talking about the G80 generation? G92? What?

Also, as far as I can tell, you think I'm making some kind of point about SLI, though I'm not sure what. I was exaggerating to make a point. For now, forget I said anything about SLI.

You're just stating a bunch of factually correct things which address points that I never made. Look, let's make this easy.

Quote:
I'll agree that the complete lack of competition has slowed down the graphics card market but the reason the 8800 GTX still isn't being fully-utilized is because this generation is a transitional one.


OK, here you agree with the only point I'm trying to make. I don't know where you get the idea that I'm complaining about the lack of utilization of modern GPUs, since you initially brought that up.

I made a comment about how we haven't had significant new GPU technology since the G80, and you go off at me making a bunch of factually correct arguments about SLI which aren't tied to any kind of overall reasoning at all. I don't know what your deal is today; you usually do a good job of looking at what's being said and responding in kind, but today you won't even tell me what point you're trying to make.
2008-05-17, 2:05 PM #68
Obi, what I'm saying is that GPUs are getting better at the same rate that they always have, even though the market has slowed down. I'm also stating that the evolutionary redesigning of a product is how things were always done and how things must be done when you have a 300 million transistor beast that cost $2 billion in R&D.

Again: the new designs (by which I mean any DX10 card) scale up much better than previous generations. If a 3D application is perfectly GPU-bound, a card with twice as many shader units and twice the bandwidth will pretty much run the 3D application twice as fast, whether you're talking about putting two cores on a single card or doing a SLI abomination like the GX2. You're looking at a much greater jump in performance over that of, say, going from the GeForce3 Ti500 to a GeForce4 Ti4600, even compensating for release times.

What I've been saying is that you've been forming incorrect assumptions based on an incorrect interpretation of the facts. Everything has been relevant.
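The scaling claim in the post above reduces to a simple throughput model (a sketch with made-up numbers): if frame time is the CPU-side cost plus the GPU work divided by the number of shader units, a perfectly GPU-bound frame runs exactly twice as fast with twice the units, while any CPU-bound share caps the gain, Amdahl's-law style.

```python
# Toy model of the "twice the shader units -> twice as fast" claim for
# a perfectly GPU-bound application. All numbers are made up.
def frame_time(gpu_work, cpu_work, gpu_units):
    # GPU work divides across units; CPU work does not.
    return cpu_work + gpu_work / gpu_units

# Perfectly GPU-bound: doubling the units halves the frame time.
t1 = frame_time(gpu_work=16.0, cpu_work=0.0, gpu_units=1)
t2 = frame_time(gpu_work=16.0, cpu_work=0.0, gpu_units=2)
assert t1 / t2 == 2.0

# With a CPU-bound component, the speedup falls short of 2x.
t1 = frame_time(16.0, 4.0, 1)   # 20 units of time
t2 = frame_time(16.0, 4.0, 2)   # 12 units of time
assert t1 / t2 < 2.0
```

This is also why the earlier point about DirectX 9 matters: a game that never saturates the GPU keeps `cpu_work` dominant, and the doubled hardware sits idle.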
2008-05-17, 3:26 PM #69
Originally posted by Obi_Kwiet:
:psyduck:

The 8800GTS 512 was a totally different configuration from the 640. Just because they were able to tweak the core so they could sell a higher-specced version of it at a lower price doesn't mean it was a significant change in technology. The fact that they didn't release anything new meant that they were able to tweak their core for maximum performance for the yield.

You do know the 8800GTS 512 was a G92 instead of a G90 right?

As for the rest of the idiocy you've been posting, listen to Jon. He knows what he's talking about. The technicalities aren't irrelevant. I'm not going to pretend I understand every word that he's saying, but it's not hard to get the gist of it, especially if you wiki a few of the terms he uses.
D E A T H
2008-05-17, 5:11 PM #70
Originally posted by Dj Yoshi:
You do know the 8800GTS 512 was a G92 instead of a G90 right?

And you do know that there is no G90, right? And that the G92 is a tweaked G80, right?

Quote:
As for the rest of the idiocy you've been posting, listen to Jon. He knows what he's talking about.


He agrees with the point I was making (basically).


Quote:
Obi, what I'm saying is that GPUs are getting better at the same rate that they always have, even though the market has slowed down. I'm also stating that the evolutionary redesigning of a product is how things were always done and how things must be done when you have a 300 million transistor beast that cost $2 billion in R&D.


Probably true, but I'm not sure you can say that definitively until the tardy next-gen cards come out in June. I guess I should have said that, due to industry issues, it is possible that card development could progress at a slower rate than it would if competition were more heated.

Quote:
Again: the new designs (by which I mean any DX10 card) scale up much better than previous generations. If a 3D application is perfectly GPU-bound, a card with twice as many shader units and twice the bandwidth will pretty much run the 3D application twice as fast, whether you're talking about putting two cores on a single card or doing a SLI abomination like the GX2. You're looking at a much greater jump in performance over that of, say, going from the GeForce3 Ti500 to a GeForce4 Ti4600, even compensating for release times.


Oh definitely. I agree on all counts. The G80 is awesome, and the R600 definitely has a lot of potential that will probably never be tapped. Forget what I said about 32 G70s. I was just making the point that the GX2 is nothing new technology-wise; even though it does scale better than the 7900GX2, that's a function of the G80/92 core itself rather than the GX2.

Quote:
What I've been saying is that you've been forming incorrect assumptions based on an incorrect interpretation of the facts. Everything has been relevant.


Yeah, except the stuff about SLI.