
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

First Look at ATI's R600(?)
2007-01-03, 8:12 PM #41
[QUOTE=Dj Yoshi]"I buy nVidia because it's not ATI"

...

That's like saying "I buy ford because it's not GM". Seriously, what's the real reason? What's wrong with ATI?[/QUOTE]


Unpleasant experiences, asshead technical support, angry customer. And it didn't just happen once.
2007-01-03, 8:29 PM #42
Originally posted by Forsakahn:
Just a bit over a year (i got a 6600 last Christmas). Can't say I've heard much about the ATi drivers since, or noticed any real problems with nVidia (apart from that annoying SLi marketing ploy last summer).



You... are out of touch with reality. :eek:
2007-01-03, 8:39 PM #43
Originally posted by drizzt2k2:
Where do they get these stupid names? Can't it just be Radeon 2007? They should go by year, so everyone knows which one is kind of up to date by the name.

R600 is just the codename--the name the graphics card goes by until ATI releases it under an actual product name.

Originally posted by Rob:
Unpleasant experiences, asshead technical support, angry customer. And it didn't just happen once.

1) Unpleasant experiences?
2) Technical support is on the head of the manufacturer, not the gpu company. ATI is not responsible for what Sapphire does technical support wise, and neither is nVidia responsible for what XFX does. You're blaming the wrong people.
3) It might not have happened once, but what card, what situation, and what else were the circumstances?

Something tells me you're wrongly placing the blame, which is silly.

Forsakahn--do you actually know anything about graphics cards? SLi isn't a marketing ploy.
D E A T H
2007-01-03, 9:19 PM #44
I don't care what you think really.

I know I'm being bull-headed.

I am not going to use an ATI product EVER AGAIN if I can help it.


The end.
2007-01-03, 10:06 PM #45
Originally posted by Rob:
I don't care what you think really.

I know I'm being bull-headed.

I am not going to use an ATI product EVER AGAIN if I can help it.


The end.

Stupid is the term. It's not "bull-headed". Stupid. You don't realize ATI doesn't actually make its own products (save for the GPU and a few other minor components). Other producers make the products.

So stupid. Ignorant. Or just a mixture of both.
D E A T H
2007-01-03, 10:08 PM #46
Who cares, let him buy nVidia.

I like ATI because it starts with A.
2007-01-03, 10:10 PM #47
You are a very intelligent consumer.
SnailIracing:n(500tpostshpereline)pants
-----------------------------@%
2007-01-03, 10:11 PM #48
I only buy A brands when I can help it.

Asus
AMD
ATI

wooo
2007-01-03, 10:29 PM #49
[QUOTE=Vincent Valentine]Who cares, let him buy nVidia.

I like ATI because it starts with A.[/QUOTE]
I try to keep people educated as best as possible, and by making an example of stupidity, hopefully someone won't make the same mistake.

Truth is, ATI has had the most powerful cards in each generation so far; even the X800 and X1800/X1900 were better than their counterparts (although the X800 not by much, and on a game-by-game basis, because of its lack of SM3.0 capability). They've been mopping up for a while, and people don't buy ATI because of blind nVidia fanboyism--not just a couple hundred stupid consumers, but thousands, if not tens or hundreds of thousands. Even if only one person goes "Hey, maybe I should look into ATI again" after my post, I'd be happy. Even if that person isn't Rob.
D E A T H
2007-01-03, 10:30 PM #50
Hey, maybe I should look into ATI again.
2007-01-04, 9:23 AM #51
[QUOTE=Dj Yoshi]stuff[/QUOTE]
I agree, and have always preferred ATI cards, but it's nVidia's better Linux driver support that wins for me, in the end.
Naked Feet are Happy Feet
:omgkroko:
2007-01-04, 11:14 AM #52
By some odd stroke of fate, I've always used ATi vid cards even though I haven't bought a computer in years.

The school computers, my computer, and all of my friends' computers have ATi cards.
Rather odd, that.
2007-01-04, 11:46 AM #53
[QUOTE=Dj Yoshi]Stupid is the term. It's not "bull-headed". Stupid. You don't realize ATI doesn't actually make its own products (save for the GPU and a few other minor components). Other producers make the products.

So stupid. Ignorant. Or just a mixture of both.[/QUOTE]


They write the drivers, which aside from HARDWARE FAILURE were a big part of the problem.
2007-01-04, 11:56 AM #54
Originally posted by Tiberium_Empire:
Always, for some odd stroke of fate, have used ATi vid cards even though i haven't bought a computer in years.

The school computers, my computer, and all of my friends have ATi cards.
Rather odd that.

I'll wait for Rob, Jepman, or Echoman to say it better...
>>untie shoes
2007-01-04, 12:08 PM #55
Originally posted by Rob:
They write up the drivers, which aside from HARWARE FAILURE were a big part of the problem.

It's either the drivers or the hardware. It's not both. And their drivers are fine.

So still stupid.
D E A T H
2007-01-04, 12:09 PM #56
I of course miss this because I hadn't bothered to log into Massassi for a while. This happens when you try to keep accounts on several dozen forums... :(

Anyhow, that site, Level505, is a fake; that article is pretty much the only thing up there. And I'm a bit disappointed that no one here yet thought to do (or at least, to post that they did) a Whois search on the domain. It was registered on December 28th, and the registrant was careful to conceal every bit of information they could.

In other words, the site is a fake. Coincidentally, I don't actually think that R600 is weaker than G80, but in fact the opposite; given that it appears that R600 will boast higher core clock speeds, GDDR4 memory for higher memory clock speeds, and a 512-bit REAL memory interface (that "ringbus" would then be at 1024-bit), I can't help but think that there would be a substantial performance gap, more like 35-50%, rather than the 10-20% suggested by the article. It can happen, after all... Anyone remember the GeForce 4 Ti?
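The bandwidth arithmetic behind that 512-bit claim is easy to sketch: peak memory bandwidth is just bus width (in bytes) times effective memory data rate. The 8800 GTX line below uses its known shipping spec (384-bit, 1800 MT/s GDDR3); the R600 figures are purely assumed for illustration, since the card was unreleased:

```python
def mem_bandwidth_gbs(bus_width_bits, effective_rate_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * effective_rate_mts * 1e6 / 1e9

# GeForce 8800 GTX: 384-bit bus, GDDR3 at 1800 MT/s effective (known spec)
g80 = mem_bandwidth_gbs(384, 1800)   # 86.4 GB/s

# Rumored R600: 512-bit bus; the 2200 MT/s GDDR4 rate is an assumption here
r600 = mem_bandwidth_gbs(512, 2200)  # 140.8 GB/s

print(f"G80: {g80:.1f} GB/s, rumored R600: {r600:.1f} GB/s, ratio {r600 / g80:.2f}x")
```

Even before core clocks enter the picture, the assumed wider bus alone would give R600 roughly 1.6x the memory bandwidth, which is the kind of gap the post is arguing for.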

[QUOTE=Dj Yoshi]What's wrong with ATI?[/QUOTE]
Just as a guess, perhaps your support of them. :v:
Wake up, George Lucas... The Matrix has you...
2007-01-04, 12:14 PM #57
Originally posted by nottheking:
I of course miss this because I hadn't bothered to log into Massassi for a while. This happens when you try to keep accounts on several dozen forums... :(

Anyhow, that site, Level505, is a fake; that article is pretty much the only thing up there. And I'm a bit disappointed that no one here yet thought to do (or at least, to post that they did) a Whois search on the domain. It was registered on December 28th, and the registrant was careful to conceal every bit of information they could.

In other words, the site is a fake. Coincidentally, I don't actually think that R600 is weaker than G80, but in fact the opposite; given that it appears that R600 will boast higher core clock speeds, GDDR4 memory for higher memory clock speeds, and a 512-bit REAL memory interface (that "ringbus" would then be at 1024-bit), I can't help but think that there would be a substantial performance gap, more like 35-50%, rather than the 10-20% suggested by the article. It can happen, after all... Anyone remember the GeForce 4 Ti?


Just as a guess, perhaps your support of them. :V

Probably. Wouldn't surprise me. I'm fairly sure I could go "I'm Christian" and half the christians here would denounce god as the cocksucking, *******ing lame ***** he is.

Kinda lame that it's a fake, but the numbers kind of lined up with what I was thinking the R600 would be. We'll see, but I expect it to kick *** and take names.
D E A T H
2007-01-04, 12:47 PM #58
[QUOTE=Dj Yoshi]Kinda lame that it's a fake, but the numbers kind of lined up with what I was thinking the R600 would be. We'll see, but I expect it to kick *** and take names.[/QUOTE]
Oh, I'm rather certain it will; a good side to this is that it also means that nVidia will have to slash the price of the GeForce 8800 to compete (as well as actually consider getting the GeForce 8600 out the door), which means it will be good for all. That site seemed to understate it, suggesting it would first ship with GDDR3. Most of the specs they posted for it more closely resembled the GeForce 8800GTX anyway.
Wake up, George Lucas... The Matrix has you...
2007-01-04, 12:54 PM #59
(Side note: Yoshi, Rob, I realize you two are probably the least likely to care that either of you are insulting each other, however, tone it down anyways.)
omnia mea mecum porto
2007-01-04, 12:58 PM #60
He wasn't insulting me.


He said my reason for not liking ATI is stupid. It is. I don't care, it totally isn't an insult or a big deal.


Maybe we should just remove all adjectives that have possible negative connotations?
2007-01-04, 1:10 PM #61
I wasn't saying he was insulting you, I said you guys are probably the least likely to care if it happened. I still don't want to leave for the day and come back to see things elevating to that level. That's all.
omnia mea mecum porto
2007-01-04, 1:13 PM #62
[QUOTE=Vincent Valentine]Who cares, let him by nVidia.

I like ATI because it starts with A.[/QUOTE]

<3
2007-01-04, 1:15 PM #63
Originally posted by Roach:
I wasn't saying he was insulting you, I said you guys are probably the least likely to care if it happened. I still don't want to leave for the day and come back to see things elevating to that level. That's all.



Meh.

I might disagree with Yoshi, and we might have heated arguments over dumb topics at a really consistent rate... but I don't think either of us holds a beef. We get along fine every other time. I don't think I've ever meant to PERSONALLY insult him, and I don't think he's ever meant to personally insult me. Although to many people it may look the other way around.
2007-01-04, 6:40 PM #64
[QUOTE=Dj Yoshi]Forsakhan--do you actually know anything about graphics cards? SLi isn't a marketing ploy.[/QUOTE]

I'm not talking about the technology itself (personally, i wish that i could use SLi on my machine (outdated Intel board and no money to replace it. enough said)). What i'm talking about is the driver "error" that lasted for a couple of months last summer, which basically told anyone that didn't have SLi running (for any reason) that their SLi was disabled and then proceeded to tell them how to fix it (ie, by giving them a link to a site that listed nVidia's partners and where to buy their parts).

Apart from that, i haven't had any complaints.
50000 episodes of badmouthing and screaming like a constipated goat cant be wrong. - Mr. Stafford
2007-01-04, 7:01 PM #65
Originally posted by Rob:
I just buy Nvidia because it's not ATI.
If you are thinking of buying a DX10 card, do not buy one from the 8000-series of NVIDIA cards. The R600 is based on the Unified Shader Model technologies ATI developed for the Xbox 360 - it is a "true" DX10 card. The new GeForce isn't - it emulates a Unified Shader Model using a software driver. It's a damn fast DX9 card that crushes everything else on the shelves right now, but if you're buying it for DX10 purposes you are going to be very disappointed with it.

If you insist upon buying an NVIDIA card you should wait for their second-generation DX10 offering. This one isn't worth the money.
2007-01-04, 7:02 PM #66
Originally posted by Forsakahn:
I'm not talking about the technology itself (personally, i wish that i could use SLi on my machine (outdated Intel board and no money to replace it. enough said). What i'm talking about is the driver "error" that lasted for a couple of months back last summer which basically told anyone that didn't have SLi running (for any reason) that their SLi was disabled and then proceeded to tell that how to fix it (ie, by giving them a link to a site that listed nVidia's partners and where to buy their parts).

Apart from that, i haven't had any complaints.

Error: Unmatched '(' on line 1 :psyduck:
2007-01-04, 7:08 PM #67
Originally posted by Jon`C:
If you are thinking of buying a DX10 card, do not buy one from the 8000-series of NVIDIA cards. The R600 is based on the Unified Shader Model technologies ATI developed for the Xbox 360 - it is a "true" DX10 card. The new GeForce isn't - it emulates a Unified Shader Model using a software driver. It's a damn fast DX9 card that crushes everything else on the shelves right now, but if you're buying it for DX10 purposes you are going to be very disappointed with it.

If you insist upon buying an NVIDIA card you should wait for their second-generation DX10 offering. This one isn't worth the money.

Just curious, do you have any reference for this? Everything I've read so far has pointed towards the G80 adhering to the unified shader model. At the very least, I can't find anything that says otherwise.
D E A T H
2007-01-04, 7:22 PM #68
[QUOTE=Dj Yoshi]Just curious, do you have any reference for this? Everything I've read so far has pointed towards the G80 adhering to the unified shader model. At the very least, I can't find anything saying anything about it.[/QUOTE]
http://www.theinquirer.net/default.aspx?article=32769
2007-01-04, 7:25 PM #69
Although I like nVidia more, I have an ATI card because it was a good card at a good price.
2007-01-04, 7:35 PM #70

No offense, but The Inquirer is a bit...less than trustworthy to me considering their exhibition with the Revolution's "Confirmed Specs" a while back. I checked HardOCP and Anandtech's reviews of the 8800 and they both say it has the Unified Architecture that DX10 outlines.
D E A T H
2007-01-04, 7:41 PM #71
[QUOTE=Dj Yoshi]No offense, but The Inquirer is a bit...less than trustworthy to me considering their exhibition with the Revolution's "Confirmed Specs" a while back. I checked HardOCP and Anandtech's reviews of the 8800 and they both say it has the Unified Architecture that DX10 outlines.[/QUOTE]Hahaha this made mb's post earlier in the thread gold.

DX10 specification allows for software emulation of a unified shader model. USM is a 'hardware' thing - in a software sense it's divided up into three different kinds of shaders (vertex, pixel and geometry), and geometry can be emulated on the pixel shader. It follows the DX10 spec, but only because Nvidia lobbied them to change it.
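The load-balancing argument behind a hardware unified shader model can be illustrated with a toy scheduler. All unit counts and workloads below are invented (one op per unit per cycle); real GPU scheduling is far more complex, but the shape of the argument holds:

```python
import math

def cycles_fixed(vertex_work, pixel_work, vertex_units, pixel_units):
    # Fixed partition: vertex units can only run vertex ops and pixel units
    # only pixel ops, so the busier pool sets the frame time.
    return max(math.ceil(vertex_work / vertex_units),
               math.ceil(pixel_work / pixel_units))

def cycles_unified(vertex_work, pixel_work, total_units):
    # Unified pool: any unit runs any op, so only the total load matters.
    return math.ceil((vertex_work + pixel_work) / total_units)

# A pixel-heavy frame: 20 vertex ops vs 300 pixel ops, 32 units either way.
print(cycles_fixed(20, 300, vertex_units=8, pixel_units=24))  # pixel pool bottlenecks
print(cycles_unified(20, 300, total_units=32))
```

The unified pool finishes sooner because otherwise-idle vertex units pick up pixel work. Whether a given card actually has such hardware, or merely presents DX10's three shader stages on split hardware via the driver, is exactly the point in dispute here.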
2007-01-04, 7:44 PM #72
Originally posted by Jon`C:
Hahaha this made mb's post earlier in the thread gold.

Why, because I checked with the two biggest hardware review sites on the internet with the most clout and came up with something different from a place that's less than trustworthy? Seriously, if you're gonna link somewhere, link somewhere that doesn't have a (recent) history of lying. Also, hardocp's forums are where the idiots are--the hardware reviews are perfectly fine.

Originally posted by Jon`C:
DX10 specification allows for software emulation of a unified shader model. USM is a 'hardware' thing - in a software sense it's divided up into three different kinds of shaders (vertex, pixel and geometry), and geometry can be emulated on the pixel shader. It follows the DX10 spec, but only because Nvidia lobbied them to change it.

So basically, pixel shaders can do geometry and pixel, but not vertex?
D E A T H
2007-01-04, 7:51 PM #73
[QUOTE=Dj Yoshi]So basically, pixel shaders can do geometry and pixel, but not vertex?[/QUOTE]Not exactly, but from what I gathered (during the last 6 months) that's how the NVIDIA card works. In fact this whole "G80 has a Unified Shader Model!" thing is the first I've heard of it. Like I said, they even talked Microsoft into relaxing the DX10 specification and allowing for cards that don't have a Unified Shader Model.

I'll be honest and say that I don't know exactly how the emulation works, but ATI had a demonstration of DirectX 10 hardware being emulated on a X1950's pixel shader at the last GDC. It's easily possible but it's something you wouldn't be able to do under, say, vanilla DX9 without custom drivers.
2007-01-04, 7:55 PM #74
Originally posted by Jon`C:
Not exactly, but from what I gathered (during the last 6 months) that's how the NVIDIA card works. In fact this whole "G80 has a Unified Shader Model!" thing is the first I've heard of it. Like I said, they even talked Microsoft into relaxing the DX10 specification and allowing for cards that don't have a Unified Shader Model.

I'll be honest and say that I don't know exactly how the emulation works, but ATI had a demonstration of DirectX 10 hardware being emulated on a X1950's pixel shader at the last GDC. It's easily possible but it's something you wouldn't be able to do under, say, vanilla DX9 without custom drivers.

I see. That makes the market a whole lot more interesting--you don't NEED a DX10 card to do DX10. Not that it's overall advisable as I'm sure there's gonna be quite the performance hit (as with ATI and SM3.0 with the X800). I had heard they relaxed the restrictions, but I didn't know how or why.

Makes sense though, that's why I've never heard of the G80 not being fully DX10 compatible after its release, but remember seeing something about it before release. Up until now I was willing to take your word on it because of that, but after reading a lot of reviews I couldn't come up with anything solid. Thanks, though, no offense meant.
D E A T H
2007-01-04, 8:03 PM #75
I doubt ATI or NVIDIA will release DX10 emulation drivers for cards that aren't selling at DX10 prices.
2007-01-04, 8:07 PM #76
Originally posted by Jon`C:
I doubt ATI or NVIDIA will release DX10 emulation drivers for cards that aren't selling at DX10 prices.

True... I might see ATI doing it, but not nVidia.
D E A T H
2007-01-04, 8:17 PM #77
Originally posted by Jon`C:
If you are thinking of buying a DX10 card, do not buy one from the 8000-series of NVIDIA cards. The R600 is based on the Unified Shader Model technologies ATI developed for the Xbox 360 - it is a "true" DX10 card. The new GeForce isn't - it emulates a Unified Shader Model using a software driver. It's a damn fast DX9 card that crushes everything else on the shelves right now, but if you're buying it for DX10 purposes you are going to be very disappointed with it.

If you insist upon buying an NVIDIA card you should wait for their second-generation DX10 offering. This one isn't worth the money.



Thanks Jon.