
This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!


"View" counts are as of the day the forums were archived, and will no longer increase.

Cray Supercomputers
2004-10-05, 8:54 PM #1
The new Cray

In other news, Epic Games stocks went up, as projections for Unreal Engine 3 game sales soared.
D E A T H
2004-10-05, 9:39 PM #2
I wonder how Doom3 or HL2 would run on that thing. You could run the Linux build of Doom3 on it (you probably wouldn't even need a graphics card).

What is the exact speed?
2004-10-06, 5:08 AM #3
Quote:
Originally posted by Pagewizard_YKS
What is the exact speed?

58 GFLOPS for a single chassis, 691 GFLOPS for one rack.
And when the moment is right, I'm gonna fly a kite.
2004-10-06, 11:07 AM #4
58 GFlops for a single unit? Impressive. By comparison, here's the power of the CPUs in modern consoles:
  • Intel Celeron 733MHz (Microsoft XBox): 550 Megaflops
  • Toshiba/Sony Emotion Engine 300MHz (Sony PlayStation2): 6.2 Gigaflops
  • IBM/Nintendo Gekko 485MHz (Nintendo GameCube): 8.6 Gigaflops

Please note that, like conventional PCs, the XBox hardly uses the CPU for any floating-point operations for 3D rendering, instead relying entirely on the GeForce 2. By comparison, both the PS2 and GC rely fairly heavily on the CPU for floating point operations, though their GPU is still far better at them.

That said, though, a single XD1 unit would be able to play Doom3, HL2, or even U3 at maxed everything, and get framerates into the hundreds. By comparison, my AMD Athlon64 2800+ performs at around 5 Gigaflops.
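
Peak figures like these usually come straight from clock rate times floating-point operations issued per cycle across all of a chip's units. A rough C sketch of that arithmetic (the per-cycle figure below is an assumption picked to land near the Emotion Engine number quoted above, not a vendor spec):

#include <stdio.h>

/* Theoretical peak: clock in GHz times FLOPs issued per cycle. */
static double peak_gflops(double clock_ghz, double flops_per_cycle) {
    return clock_ghz * flops_per_cycle;
}

int main(void) {
    /* e.g. a 0.3 GHz chip issuing roughly 20.7 FLOPs per cycle comes out near 6.2 GFLOPS */
    printf("%.1f GFLOPS\n", peak_gflops(0.3, 20.7));
    return 0;
}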
Wake up, George Lucas... The Matrix has you...
2004-10-06, 11:11 AM #5
what is a gflop?
2004-10-06, 11:16 AM #6
Quote:
Originally posted by Pagewizard_YKS
what is a gflop?

Giga-flop, or billions of FLOating-Point operations per second. This represents a computer's capacity to operate on numbers where precision can be limited to a given amount of significant digits (such as storing 5 million as a single significant digit plus an exponent).

It is this floating point capability that allows computers to not crash when presented with a problem like 1/3, as floating point numbers can be used to represent infinity.
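
A minimal C sketch of the idea, assuming IEEE 754 single-precision floats (the names and values are only illustrative):

#include <stdio.h>

int main(void) {
    float five_million = 5e6f;  /* stored as a significant part plus an exponent, not as seven digits */
    float zero = 0.0f;
    float third = 1.0f / 3.0f;  /* a finite approximation, about 0.333333 */
    float inf = 1.0f / zero;    /* IEEE 754 defines an infinity value instead of faulting */
    printf("%g %g %g\n", five_million, third, inf);
    return 0;
}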
Wake up, George Lucas... The Matrix has you...
2004-10-06, 3:42 PM #7
Quote:
Originally posted by nottheking
Please note that, like conventional PCs, the XBox hardly uses the CPU for any floating-point operations for 3D rendering, instead relying entirely on the GeForce 2. By comparison, both the PS2 and GC rely fairly heavily on the CPU for floating point operations, though their GPU is still far better at them.

That, and the fact that the xbox runs a cheezy x86 chip, while the PS2 runs a MIPS RISC chip.


RISC is good. ;)
And when the moment is right, I'm gonna fly a kite.
2004-10-06, 3:50 PM #8
Quote:
Originally posted by Dj Yoshi
The new Cray

In other news, Epic Games stocks went up, as projections for Unreal Engine 3 game sales soared.


Is that true? Because I think someone cracked a joke about Cray computer sales going up in the Unreal 3 thread, and if you're reusing the joke....
2004-10-06, 4:03 PM #9
Quote:
Originally posted by nottheking
58 GFlops for a single unit? Impressive. By comparison, here's the power of the CPUs in modern consoles:
  • Intel Celeron 733MHz (Microsoft XBox): 550 Megaflops
  • Toshiba/Sony Emotion Engine 300MHz (Sony PlayStation2): 6.2 Gigaflops
  • IBM/Nintendo Gekko 485MHz (Nintendo GameCube): 8.6 Gigaflops

Please note that, like conventional PCs, the XBox hardly uses the CPU for any floating-point operations for 3D rendering, instead relying entirely on the GeForce 2. By comparison, both the PS2 and GC rely fairly heavily on the CPU for floating point operations, though their GPU is still far better at them.

That said, though, a single XD1 unit would be able to play Doom3, HL2, or even U3 at maxed everything, and get framerates into the hundreds. By comparison, my AMD Athlon64 2800+ performs at around 5 Gigaflops.


Just nit picking, but the Xbox uses a "specially designed" GeForce 3 64MB.
Got a permanent feather in my cap;
Got a stretch to my stride;
a stroll to my step;
2004-10-06, 4:04 PM #10
I was gonna say something too, but oh well lol. Maybe he just accidentally hit 2 instead of 3...?
2004-10-06, 4:06 PM #11
I was 'reusing' the joke.
D E A T H
2004-10-06, 5:34 PM #12
Quote:
Originally posted by Snoopfighter639
Is that true? Because I think someone cracked a joke about Cray computer sales going up in the Unreal 3 thread, and if you're reusing the joke....


Mwuahaha, that was my joke. However, it was itself inspired by a comment in my even earlier thread about the new IBM supercomputers, in which somebody asked "But will it play HL2?"
Stuff
2004-10-06, 7:17 PM #13
Quote:
Originally posted by kyle90
Mwuahaha, that was my joke. However, it was itself inspired by a comment in my even earlier thread about the new IBM supercomputers, in which somebody asked "But will it play HL2?"


Which was inspired, and correct me if I'm wrong, by the HL2 trailer/demo video where one person asks if his 486 could run the game.
Marsz, marsz, Dąbrowski,
Z ziemi włoskiej do Polski,
Za twoim przewodem
Złączym się z narodem.
2004-10-06, 8:59 PM #14
Wow, the joke never ends.... and will a 486 with a 6800 Ultra O/C work????
2004-10-08, 10:46 AM #15
Quote:
Originally posted by gbk
That, and the fact that the xbox runs a cheezy x86 chip, while the PS2 runs a MIPS RISC chip.


RISC is good. ;)

It (the Toshiba/Sony Emotion Engine) is a RISC chip (as all specialized processors generally are), but not made by MIPS. MIPS made the R4300i in the N64, and is making the R4000s to be used in the PSP.

Quote:
Originally posted by Sol
Just nit picking, but the Xbox uses a "specially designed" GeForce 3 64MB.

I've seen conflicting reports on which generation it is. Either way, it is an NVidia-designed chip that supports up to DirectX 8.1, which means Shader Model 1.1. Also, it does not have any RAM on it; the GeForce chip is built onto the motherboard, and it has 32MB of the 64MB total PC133 memory dedicated to it. Normally, I recall that such GeForce cards use at least PC2100 DDR, and modern ones use DDR3.
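
For scale, PC2100 is named for its peak bandwidth in MB/s, while PC133 is named for its clock; a quick C sketch of the arithmetic (assuming the standard 64-bit, 8-byte module width):

#include <stdio.h>

/* Peak bandwidth of an 8-byte-wide memory bus in MB/s. */
static double peak_mb_per_s(double transfers_mhz) {
    return transfers_mhz * 8.0;
}

int main(void) {
    printf("PC133 SDRAM: %.0f MB/s\n", peak_mb_per_s(133.0));  /* about 1064 MB/s */
    printf("PC2100 DDR:  %.0f MB/s\n", peak_mb_per_s(266.0));  /* DDR transfers twice per 133 MHz clock */
    return 0;
}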

Quote:
Originally posted by Snoopfighter639
Wow, the joke never ends.... and will a 486 with a 6800 Ultra O/C work????

No, it won't. An 80486 chip can only perform integer operations at up to 16 bits of accuracy. Normally, some larger equations can be used in games (such as Doom using 32-bit numbers), but they have a minimum for their heavily used operations. HL2, Doom3, and just about all games back to JK require a 32-bit processor; the first PC chip of these would be an Intel Pentium 75MHz. There's a chance it might work with that (presuming you hacked the game to think it's enough), but your framerate would be very poor, as it might take over a second to calculate what happens for each frame.
Wake up, George Lucas... The Matrix has you...
2004-10-08, 11:01 AM #16
Quote:
Originally posted by nottheking
It (the Toshiba/Sony Emotion Engine) is a RISC chip (as all specialized processors generally are), but not made by MIPS. MIPS made the R4300i in the N64, and is making the R4000s to be used in the PSP.


I've seen conflicting reports on which generation it is. Either way, it is an NVidia-designed chip that supports up to DirectX 8.1, which means Shader Model 1.1. Also, it does not have any RAM on it; the GeForce chip is built onto the motherboard, and it has 32MB of the 64MB total PC133 memory dedicated to it. Normally, I recall that such GeForce cards use at least PC2100 DDR, and modern ones use DDR3.


No, it won't. An 80486 chip can only perform integer operations at up to 16 bits of accuracy. Normally, some larger equations can be used in games (such as Doom using 32-bit numbers), but they have a minimum for their heavily used operations. HL2, Doom3, and just about all games back to JK require a 32-bit processor; the first PC chip of these would be an Intel Pentium 75MHz. There's a chance it might work with that (presuming you hacked the game to think it's enough), but your framerate would be very poor, as it might take over a second to calculate what happens for each frame.


JK runs on a 486... if not, then my laptop 486DX-75 must be lying when it plays at about 15fps.
2004-10-08, 12:08 PM #17
Quote:
Originally posted by Cool Matty
JK runs on a 486... if not, then my laptop 486DX-75 must be lying when it plays at about 15fps.

???

The minimum requirements for JK state that it needs a Pentium 90MHz... And exactly how did you get around this?
Wake up, George Lucas... The Matrix has you...
2004-10-08, 12:28 PM #18
Quote:
No, it won't. An 80486 chip can only perform integer operations at up to 16 bits of accuracy. Normally, some larger equations can be used in games (such as Doom using 32-bit numbers), but they have a minimum for their heavily used operations. HL2, Doom3, and just about all games back to JK require a 32-bit processor; the first PC chip of these would be an Intel Pentium 75MHz. There's a chance it might work with that (presuming you hacked the game to think it's enough), but your framerate would be very poor, as it might take over a second to calculate what happens for each frame.


The 386 was the first 32-bit x86 processor.

There is no way to "hack" JK to run on a 286 or other 16-bit processor. Windows isn't called a "32-bit protected mode OS" just for the fun of it.
2004-10-08, 1:07 PM #19
Quote:
Originally posted by Argath
The 386 was the first 32-bit x86 processor.

You're counting the wrong thing. You're thinking of the bit width of the processor's data path. By that, the 80386 was indeed the first 32-bit processor, and the Pentium was the first 64-bit processor (The Pentium IV is a 128-bit processor, by this account). However, when most companies (such as AMD, and any console maker) mention their processors, they are referring to the level of precision the processor can calculate integer operations to. Generally, this number is half of what its data path width is; i.e. the 386 was the first 16-bit processor, and the Pentium was the first 32-bit processor.

Quote:
Originally posted by Argath
There is no way to "hack" JK to run on a 286 or other 16-bit processor. Windows isn't called a "32-bit protected mode OS" just for the fun of it.

You also seem to have a different definition. It is quite possible to run non-32-bit applications on a 32-bit OS; this is evident from the fact that games designed to run on Windows 3.x (a 16-bit OS) can run just as well on Win 9X/NT (32-bit OSes). However, I don't think JK would escape with much left if one tried to convert it to be a 32-bit program, and then there's the fact that you'd need the JK source.
Wake up, George Lucas... The Matrix has you...
2004-10-08, 1:17 PM #20
Quote:
Originally posted by nottheking
The minimum requirements for JK state that it needs a Pentium 90MHz... And exactly how did you get around this?


Minimum requirements to run well. I also ran it on a 486, a DX 100. It was terribly slow but it ran.
Bassoon, n. A brazen instrument into which a fool blows out his brains.
2004-10-08, 1:45 PM #21
Quote:
...they are referring to the level of precision the processor can calculate integer operations to. Generally, this number is half of what its data path width is; i.e. the 386 was the first 16-bit processor, and the Pentium was the first 32-bit processor.


Complete garbage.

The 386 expanded the integer registers to 32-bits and they haven't changed since. Feel free to assemble and run this on a 386 in protected mode:

xor edx,edx   ; clear the high half of the dividend
mov eax,1     ; EDX:EAX = 1
mov ecx,3
div ecx       ; unsigned divide: quotient in EAX, remainder in EDX

*gasp*! It works! And the result is 0, not a crash as you claim it should be.

Changes to the ISA since the 386 have been minor. The only notable ones I can think of are PAE and syscall/sysenter replacing call gates. Other architectural changes, like wider databuses, the Pentium's superscalar core, the Pentium Pro's OOOE support, and so on have improved performance, but they don't have any effect on the ISA. Thus, in most cases, the fastest Pentium 4 and the slowest 386 are compatible.

Quote:
...It is quite possible to run non-32-bit applications on a 32-bit OS; this is evident from the fact that games designed to run on Windows 3.x (a 16-bit OS) can run just as well on Win 9X/NT (32-bit OSes). However, I don't think JK would escape with much left if one tried to convert it to be a 32-bit program, and then there's the fact that you'd need the JK source.


Duh. Both x86 and Windows were designed to be backwards compatible. x86 also has V86 mode for running real mode programs in protected mode.

JK is already a 32-bit program, just like every DirectX game and practically every other program written in the last decade.
2004-10-08, 1:49 PM #22
I was really just kidding about the 486 thing...
2004-10-09, 1:22 AM #23
Quote:
Originally posted by nottheking
You're counting the wrong thing. You're thinking of the bit width of the processor's data path.
What.
Quote:
By that, the 80386 was indeed the first 32-bit processor, and the Pentium was the first 64-bit processor (The Pentium IV is a 128-bit processor, by this account).
What.
Quote:
However, when most companies (such as AMD, and any console maker) mention their processors, they are referring to the level of precision the processor can calculate integer operations to.
What.
Quote:
Generally, this number is half of what its data path width is; i.e. the 386 was the first 16-bit processor, and the Pentium was the first 32-bit processor.
What.

You know that trick a lot of clowns do where they pull like 700 feet of ribbon out of their pockets? This is a lot like that. Only instead of ribbon it's bullcrap, and instead of a pocket it's nottheking's butt.
2004-10-09, 1:53 AM #24
Quote:
Originally posted by Argath
xor edx,edx   ; clear the high half of the dividend
mov eax,1     ; EDX:EAX = 1
mov ecx,3
div ecx       ; unsigned divide: quotient in EAX, remainder in EDX

*gasp*! It works! And the result is 0, not a crash as you claim it should be.
Not to mention that there are special instructions for floating point operations. The 386 didn't actually have a built-in floating point unit. Floating point instructions are interpreted by the 387 if available. That code will yield the exact same results on a Pentium 8 EE clocked at 594 THz. Why? Because getting a 0 is mathematically correct. Go figure.

Secondly, what the hell? "Integer calculation accuracy"? You don't really have a lot of room for inaccuracy in integer mathematics. Register size, obviously, determines the maximum value of an integer on the system. And even this isn't always true - there are large integer libraries available for every system that has any sort of memory or storage capacity larger than a single register.
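
A small C example of the register-size point (the values are only for illustration):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t x = 4294967295u;  /* the largest value a 32-bit register can hold */
    x = x + 1;                 /* wraps around to 0: a fixed range, not a loss of accuracy */
    printf("%u\n", x);
    /* anything larger is handled in software, the way bignum libraries
       spread one huge integer across many 32-bit words */
    return 0;
}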

Thirdly, floating points are the inaccurate ones. nottheking seems to have the two horribly and painfully confused. By definition, a floating point decreases in accuracy as the value moves away from 0. It's a tradeoff.
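
A tiny C demonstration of that tradeoff; 16777216 is 2^24, the point where the gap between adjacent single-precision floats grows to 2:

#include <stdio.h>

int main(void) {
    float small = 1.0f;
    float big = 16777216.0f;       /* 2^24: adjacent floats are now 2 apart */
    printf("%f\n", small + 0.5f);  /* 1.500000: the half is representable near zero */
    printf("%f\n", big + 0.5f);    /* 16777216.000000: the half is rounded away */
    return 0;
}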

Fourthly, no - 386, 486, Pentium, Pentium Pro, Pentium MMX, Pentium 2, Pentium 3, Pentium 4s are all very much 32-bit processors.
2004-10-09, 8:47 AM #25
Jon`C freaking wins.
D E A T H
2004-10-09, 9:53 AM #26
Quote:
Originally posted by Jon`C
You know that trick a lot of clowns do where they pull like 700 feet of ribbon out of their pockets? This is a lot like that. Only instead of ribbon it's bullcrap, and instead of a pocket it's nottheking's butt.


GOLD.
Bassoon, n. A brazen instrument into which a fool blows out his brains.
2004-10-10, 9:52 AM #27
Of course, nottheking is probably just going to ignore this thread and post the same garbage the next time this topic comes up. Is there a way to beat people to death over the internet?
2004-10-10, 10:02 AM #28
Quote:
Originally posted by Dj Yoshi
Jon`C freaking wins.


And the ribbon part was great :D
2004-10-10, 10:43 AM #29
Quote:
Originally posted by Emon
GOLD.


No, that's orange.
2004-10-10, 10:49 AM #30
Quote:
Originally posted by Thrawn42689
No, that's orange.


GOLD.

Better?
