Massassi Forums

This is the static archive of the Massassi Forums. The forums are closed indefinitely. Thanks for all the memories!

You can also download Super Old Archived Message Boards from when Massassi first started.

"View" counts are as of the day the forums were archived, and will no longer increase.

Back on the 18 month treadmill (anything computer hardware)
2018-08-15, 1:29 AM #1
I'd like to open this discussion with a picture.

[image]
Do you see it? No? Here, let me try again:

[image]
How 'bout now? Still don't get it? Okay, I'll explain:

You've probably noticed that you don't need to upgrade your computer very often anymore. Upgrades these days are still cool and all, but they're just a difference in magnitude: slightly better framerates, slightly higher graphics settings, but there's never anything groundbreaking about it. A five year old computer should be able to play any new game released today, and a good three year old computer will still be able to play new games on the highest settings.

Things used to be different. Back in the 1990s and early 2000s you had to upgrade your computer every couple of years - ideally every 18 months - and it was a difference in kind. Back then it was expected that a two year old computer would struggle to play a new game. If you wanted to keep playing new, demanding games, you had to buy new hardware - frequently.

What we're looking at in the above pictures is, I think, a return to that 18 month upgrade cycle. It's not going to happen today. Not all of the pieces are there yet, but they're coming, and they're coming a lot faster than I thought they were. The first round of middleware was shown off at GDC 2018 and will take some time to polish, and then we're maybe two or three AAA release cycles after that before the powderkeg explodes.

Save your shekels, son: Doom 5 is going to require real-time hardware raytracing.


1.) CPUs

If you aren't familiar with Threadripper: it's an enthusiast grade, high end desktop/gaming CPU with anywhere from 8 to 32 cores, each with 2 hardware threads (16 to 64 logical processors on a single socket). Threadripper chips are usually clocked slower than lower core count CPUs, trading single-core performance for ridiculous parallel instruction throughput.
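
For a sense of scale, here's a tiny sketch of how an engine might size its worker pool to whatever the OS reports - the leave-one-core-for-the-OS policy is just an example, not anything AMD or any particular engine prescribes:

Code:
// worker_count.cpp - sketch: size a job system's worker pool to the
// logical processor count the machine reports.
#include <algorithm>
#include <cstdio>
#include <thread>

int main() {
    // hardware_concurrency() may return 0 if it can't tell, so clamp.
    unsigned logical = std::max(1u, std::thread::hardware_concurrency());

    // Example policy: leave one logical processor for the OS and audio,
    // spend the rest on the engine's job system.
    unsigned workers = logical > 1 ? logical - 1 : 1;

    std::printf("logical processors: %u\n", logical);
    std::printf("job system workers: %u\n", workers);
    // A 32-core/64-thread Threadripper would report 64 here.
}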

Threadripper is, without exaggeration, the most interesting thing to happen in the CPU space since Athlon 64. It's one of those crazy gambits you almost never see out of a big company, this wild guess that, if someone makes a CPU offering this tradeoff, someone else will find something to do with it.

For a consumer part, that 'someone' will almost certainly be a game developer. And for a SISD integer part, that 'something' means gameplay (as in, revolutionary changes to it). There are three major areas where modern games are more or less in the stone ages: interactivity, world mutability, and AI. All of them lend themselves well to CPUs and quite abysmally to GPUs, so my educated guess is that we're going to see revolutionary changes in most, if not all, of these areas. That hasn't happened yet, but the more the core wars heat up, the more likely it becomes.
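
Hand-wavy sketch of what "gameplay that scales with cores" could look like, assuming C++17 parallel algorithms under the hood - the Npc struct and think() logic are made up for illustration:

Code:
// npc_update.cpp - sketch: spreading per-NPC "think" work across all cores
// with C++17 parallel algorithms. Needs a standard library that implements
// <execution> (e.g. MSVC, or libstdc++ built against TBB).
#include <algorithm>
#include <cstdio>
#include <execution>
#include <vector>

struct Npc {
    float x = 0.0f;
    float goal = 100.0f;
    void think(float dt) {
        // Stand-in for pathfinding/planning; embarrassingly parallel as
        // long as each NPC only reads shared world state here.
        x += (goal - x) * 0.1f * dt;
    }
};

int main() {
    std::vector<Npc> npcs(100000);

    // One frame's AI tick, fanned out over every available core.
    std::for_each(std::execution::par, npcs.begin(), npcs.end(),
                  [](Npc& n) { n.think(1.0f / 60.0f); });

    std::printf("first NPC is now at x = %f\n",
                static_cast<double>(npcs[0].x));
}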

Speaking of core wars, it's not certain when Intel will become a part of this discussion. They come nowhere close to competing against AMD at the top of the HEDT space, and even where they do have performance competitive chips, Intel's parts are several times more expensive (or an embarrassing fraud). Not actually sure what they're thinking here - maybe their HEDT yields are so low that they actually can't be price competitive?

The good news for Intel is that they've actually been working in this space for a long time: Project Larrabee and Xeon Phi. If the throughput/single thread perf tradeoff gets important enough, they'll at least have something they can bring to market... eventually. Much like their current HEDT strategy, though, Intel whiffed the Xeon Phi consumer market to focus on the crazytown overpriced server market. As usual, they'll probably have to take a protracted beat-down before they change course.


2.) GPUs

Nvidia announced their next-generation workstation cards at SIGGRAPH, with the Turing-based Quadro RTX 8000 as flagship. It's a safe guess that the next-generation GeForce RTX 2080 will be based on this chip (name not officially announced, but teased heavily during the presentation).

Why should you care?

For starters: Based on the numbers presented, I estimate the RTX 2080 could be as much as twice as fast as the GTX 1080 - which is still an incredibly powerful card, more than two years after it released. That's very cool.

More importantly: like the Quadro RTX, it will have hardware accelerated raytracing - Nvidia Gameworks support confirmed. The Quadro RTX 8000 will be capable of 10 billion hardware rays per second, or 60 FPS, full HD at 80 samples per pixel. This brings hardware raytracing forward from some fuzzy, noisy thing occasionally useful for occlusion queries, to something that could actually replace raster graphics entirely. (For comparison, the GTX 1080 can cast 100-125 million rays per second.)
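
Quick sanity check on that arithmetic (my back-of-the-envelope, not Nvidia's math):

Code:
// ray_budget.cpp - back-of-the-envelope check of the 10 gigarays/s figure.
#include <cstdio>

int main() {
    const double width = 1920, height = 1080;   // full HD
    const double fps = 60;
    const double samples_per_pixel = 80;

    const double rays_per_second = width * height * fps * samples_per_pixel;
    std::printf("%.2f billion rays per second\n", rays_per_second / 1e9);
    // Prints ~9.95 billion, i.e. right at the quoted 10 gigarays/s
    // (ignoring secondary bounces, which eat into that budget fast).
}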

Honestly not sure what else I can say about this. They showed an Unreal Engine 4 real-time raytracing demo at this year's GDC. It involved dozens of servers each with several Quadro cards. Six months later, and we're talking about most of that horsepower on a single card. Holy moly.


3.) VR

Nothing interesting. Here's the thing, though. Major movement in VR fits with my overall thesis:

GPU support for foveated rendering is already there; it just needs some super high res screens and the right eye tracking tech. People are working on both, but they don't exist yet. If that eye tracking tech could also report on accommodation, then, along with hardware raytracing, you'd have something that's just as good as a light field display - for a fraction of the cost. And with foveated rendering cutting down the quality of 99% of the display, you're gonna have an absurd ray budget to spend on the fovea.
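
Rough numbers on that last point, assuming a made-up 4K-per-eye panel at 90 Hz, a fovea covering about 1% of the pixels, and 1 sample per pixel everywhere else:

Code:
// foveation_budget.cpp - sketch: how foveated rendering concentrates a fixed
// ray budget on the ~1% of the screen the eye can actually resolve.
// Resolution and fractions below are assumptions, not real headset specs.
#include <cstdio>

int main() {
    const double total_rays_per_frame = 10e9 / 90.0; // 10 gigarays/s at 90 Hz
    const double pixels = 2 * 4000.0 * 4000.0;       // hypothetical 4K-per-eye panel
    const double fovea_fraction = 0.01;              // ~1% of pixels get full quality
    const double periphery_spp = 1.0;                // 1 sample/pixel elsewhere

    const double periphery_rays = pixels * (1.0 - fovea_fraction) * periphery_spp;
    const double fovea_rays = total_rays_per_frame - periphery_rays;
    const double fovea_spp = fovea_rays / (pixels * fovea_fraction);

    // Prints roughly 248 samples per pixel available in the fovea.
    std::printf("samples per pixel available in the fovea: %.0f\n", fovea_spp);
}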

The other part is the high core count gameplay revolution. VR games starve for interactivity - static environments are almost painful to be in. Imagine having the CPU budget where you could just, like, pick up a sledgehammer and start tearing down the drywall. Or, some day, having the CPU budget to start talking to an NPC and then they talk back. Crazy, right? Today it is. If the average computer had thousands of CPU cores, maybe not.

And that's the world I see coming for PC hardware. Back in the 1990s and early 2000s, we'd upgrade to get a faster clock speed or more RAM. Then for a teeny tiny little while there in the late 'naughts, it seemed like we were going to do the same thing for core counts... but that never happened. I think it's finally going to happen. I think we're going to get games that gobble computer power again, CPUs competing on geometrically increasing core counts, and GPUs competing on geometrically increasing rays per second. And every 18 months, you'll have to upgrade because you don't have enough cores or enough rays. (IMO).
2018-08-15, 1:53 AM #2
Well, since I stopped playing new games at some point, I don't think I'll really ever need a new GPU-based computer anymore.

In fact, my only wish is that FL Studio would get to enabling true multi-core support already, since editing tracks with all the fancy FX plugins active can't really be done when the lag is enormous. FL just uses one core, and their instructions for something reminiscent of multi-core support don't help much - I tested it yesterday and FL's CPU "usage" went from 101% to 61%, which still isn't helpful.

Alas.
Star Wars: TODOA | DXN - Deus Ex: Nihilum
2018-08-15, 6:26 AM #3
I don't game much anymore, but if costs keep dropping the way they have up till now, I would really look forward to getting a workout in these types of things:

https://www.youtube.com/watch?v=GcheOuVwXb8&t=135s

$10,000 for the RTX 8000 - anyone have any idea for how long such a card would remain out of reach for the average poor PC gamer? Five years? More?
Tenshu
2018-08-15, 6:58 AM #4
Isn't the time it takes to make a game with bleeding edge graphics the real bottleneck to what you're talking about, where it becomes necessary to replace your hardware every 18 months? Maybe hardware will enter a phase where it improves significantly faster, but it will only take more time/money to produce games that are able to exploit those improvements.

Right? Or no?
former entrepreneur
2018-08-15, 7:04 AM #5
I went to an old arcade the other day. 15 years ago the machines there would've been state of the art arcade machines, but now it seemed underwhelming, which is too bad, because in a good arcade, you should feel like a little kid experiencing bliss, in a sensory overload, running around with all those flashing lights. It was especially disappointing feeling underwhelmed, because being overwhelmed and feeling like there's more there than you have time to enjoy seems like it's central to the whole arcade experience!

But it got me trying to imagine what a cutting edge arcade built in 2018 would look like, and I assumed there'd be a lot of VR and physical hardware designed with specific virtual experiences in mind (with VR at home, you need hardware that can accommodate different spaces, so it needs to be generalized: you need one controller that can do everything. Presumably, though, at an arcade, you could have controls that are specifically designed for individual games.)
former entrepreneur
2018-08-15, 9:57 AM #6
Fortnite is the most played game in the world and has a pretty low bar for hardware requirements. You can play it on an iPad FFS. It's not a pretty game.

I wonder if this will represent a shift in the current thinking that games must be pretty to be fun (and by extension, profitable).
2018-08-15, 10:04 AM #7
For me, it means that for all of John`C's work for Doom and Quake, Epic Megagames still managed to win.

Or, to quote Unreal Tournament:

DOMINATE!
Star Wars: TODOA | DXN - Deus Ex: Nihilum
2018-08-15, 10:12 AM #8
Look, I just want to open more browser tabs at once without seeing performance hits.
And when the moment is right, I'm gonna fly a kite.
2018-08-15, 10:40 AM #9
Originally posted by Tenshu:
$10,000 for the RTX 8000 - anyone have any idea for how long such a card would remain out of reach for the average poor PC gamer? Five years? More?
Not that long. Quadros are professional workstation cards. They have a lot of memory and certified drivers for CAD. Nothing a gamer would care about, but it adds thousands of dollars to the price.

The Quadro P6000 is $5500 and runs games about as well as a 1080 Ti. Despite the RTX 8000's much higher price, I expect a similar relationship between it and the GeForce RTX 2080: similar game and ray tracing performance, but with less memory, uncertified drivers, and the machine learning hardware disabled.

Originally posted by Eversor:
Isn't the time it takes to make a game with bleeding edge graphics the real bottleneck to what you're talking about, where it becomes necessary to replace your hardware every 18 months? Maybe hardware will enter a phase where it improves significantly faster, but it will only take more time/money to produce games that are able to exploit those improvements.

Right? Or no?
What I’m thinking is that we’re entering another phase where it’s trivial for game developers to do this again.

From the 1990s to the early 2000s, you had this arc: fillrate (pixels per second), triangles (triangles per second), shader instructions (FLOPs). Making your game look better was basically as easy as throwing more stuff at the screen. For example, it’s actually easier to make a model with more polygons than with fewer. Not only does it look better, it’s less labor intensive. So for a while you had this synergy between game companies pumping out geometrically more polygons, and hardware companies making CPUs and T&L cards to render them all.

Rays per second is the same story. Just like fillrate, triangles, and shader ops, using more rays will not only make your game look better, it’ll actually be easier than using fewer. It’s an easy dial to turn, and as long as the hardware exists to support it, game developers have every incentive to turn it.
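
To make the "dial" literal: in a toy path tracer, the quality knob is a single samples-per-pixel constant. Generic sketch, not any particular engine:

Code:
// spp_dial.cpp - sketch: samples-per-pixel as the quality dial in a toy
// path-tracer loop. trace_ray() is a stand-in for the real work.
#include <cstdio>
#include <random>

// Stand-in for shooting one ray through a pixel and shading the hit.
double trace_ray(int x, int y, std::mt19937& rng) {
    (void)x; (void)y; // unused in this stub
    std::uniform_real_distribution<double> jitter(0.0, 1.0);
    return jitter(rng); // pretend this is the radiance estimate
}

int main() {
    const int width = 320, height = 180;
    const int samples_per_pixel = 8; // <- the dial: turn it up as hardware allows

    std::mt19937 rng(42);
    double checksum = 0.0;
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            double sum = 0.0;
            for (int s = 0; s < samples_per_pixel; ++s)
                sum += trace_ray(x, y, rng);
            checksum += sum / samples_per_pixel; // average = less noise
        }
    std::printf("done, checksum %.3f (more spp = same code, better image)\n",
                checksum);
}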
2018-08-15, 2:16 PM #10
Originally posted by gbk:
Look, I just want to open more browser tabs at once without seeing performance hits.


Install an ad blocker.
2018-08-15, 2:17 PM #11
Originally posted by Brian:
Install an ad blocker.


But plz whitelist Massassi. We depend on the ad revenue to keep the site free.
2018-08-15, 3:28 PM #12
plz don't bring back the UGO ads
2018-08-15, 3:30 PM #13
I remember one ad showed a topless (censored) Lara Croft and had information about the long-fabled nude cheat for Tomb Raider. Brian had to remove it, but also cited it as the most clicked ad on the site by far.
2018-08-15, 4:23 PM #14
I don't have much to say, but I do find this quite fascinating so please carry on if you've got more bouncing around in there.

Those real time raytracing demos from GDC were p. neat, and I do remember the followup reports talking about how it'd be years before even that level of RTRT was in affordable consumer video cards. Hmm hm!
2018-08-15, 5:05 PM #15
This actually sounds pretty exciting. New developments in interactivity and AI would make gaming a whole new experience again. With raytracing I'm more neutral. I'm fine with games that look like Oblivion for the most part, what I really want is depth of gameplay.

It's why Dwarf Fortress is still one of my favorite games of all time, despite rarely ever playing.
2018-08-20, 6:54 PM #16
GeForce RTX 2080 Ti announced. 10 gigarays/s, same as the top end Quadro.
2018-08-20, 11:19 PM #17
By the way:

What I'm trying to say here is that these new chips are going to become rapidly obsolete. I'm definitely not posting this stuff to convince people to buy Threadripper 2 and RTX 2080. You really shouldn't buy them.

Right now we're in GeForce 1 or Radeon 9700 territory. New technology level, but by the time there were games that required those hardware features, they were dogs. A 1080 still has a lot of life left in it, so you should upgrade normally. Just don't be surprised if you're buying a new graphics card every couple of years after that. And do not buy the RTX 2080. Unless you're a dev, or you were going to build a new computer anyway.
2018-08-21, 5:10 AM #18
Good thing the mining craze is over, right?
SnailIracing:n(500tpostshpereline)pants
-----------------------------@%
2018-08-21, 8:44 AM #19
I very much doubt that we'll see any real push into games with ray tracing renders until the next console generation.
2018-08-21, 9:55 AM #20
I'd pay the price of an RTX2080* for a thingy that would make programs that use single CPU cores to use multiple ones regardless of any other intervening factor.
Star Wars: TODOA | DXN - Deus Ex: Nihilum
2018-08-21, 8:55 PM #21
instructions unclear, ordered 2x2080ti
gbk is 50 probably

MB IS FAT
2018-08-21, 9:01 PM #22
Originally posted by Nikumubeki:
I'd pay the price of an RTX2080* for a thingy that would make programs that use single CPU cores to use multiple ones regardless of any other intervening factor.


It’s called a superscalar CPU. You already have one. Sorry, there just aren’t that many instructions between RAW data dependencies unless the program was specifically designed for it (i.e. is multithreaded). What you’ve got today is about as good as is possible at the optimizer/instruction dispatch level.
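
A toy illustration of the limit (not a real benchmark - an optimizing compiler will happily fold the second loop):

Code:
// raw_dependency.cpp - sketch: why a wide superscalar core can't speed up a
// serial dependency chain. Both loops do the same number of additions.
#include <cstdio>

int main() {
    const int n = 100000000;

    // Chained: every add reads the previous result (a RAW dependency),
    // so even a very wide core retires roughly one of these per cycle.
    volatile long long chained = 0;
    for (int i = 0; i < n; ++i)
        chained = chained + 1;

    // Independent: four accumulators with no dependencies between them,
    // so a superscalar core can execute the adds side by side.
    long long a = 0, b = 0, c = 0, d = 0;
    for (int i = 0; i < n; i += 4) { a += 1; b += 1; c += 1; d += 1; }

    std::printf("%lld %lld\n", static_cast<long long>(chained), a + b + c + d);
}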
2018-08-21, 9:32 PM #23
Originally posted by Jon`C:
It’s called a superscalar CPU. You already have one. Sorry, there just aren’t that many instructions between RAW data dependencies unless the program was specifically designed for it (i.e. is multithreaded). What you’ve got today is about as good as is possible at the optimizer/instruction dispatch level.


Damn, I was afraid of that! Thanks anyway

Originally posted by NoESC:
instructions unclear, ordered 2x2080ti


Will you play CS 1.3 at 320x240 with all the lowest graphical settings once you receive them? Apparently that's a thing these days (or five years ago, at any rate).
Star Wars: TODOA | DXN - Deus Ex: Nihilum
2018-08-22, 7:33 AM #24
Being competitive is the only thing that matters and the way to be competitive is to run the game at 5,230 frames per second.
2018-08-22, 7:33 AM #25
The human eye can only see at 6 FPS
2018-08-22, 7:40 AM #26
3 FPS is more cinematic tho
2018-08-22, 8:23 AM #27
common misconception, actually the cinematic look is provided by the vignetting, chromatic aberration, and film grain effects.
2018-08-22, 8:23 AM #28
these are all highly desirable which is why all modern films prominently feature them.
2018-08-22, 8:28 AM #29
games just don't look realistic enough without bokeh, lens flares, orbs, blooming and streaking and all of those other effects that make it look like it was filmed on a cell phone in 2002
2018-08-22, 8:28 AM #30
Sorry, apparently "cinematic" is a trigger word for me.
2018-08-22, 3:28 PM #31
I always thought "cinematic" was just using filters to cover up low polygon count or other shoddy looking architecture.

Then idiots who don't understand why those choices were made recreate them because "that's what big companies do".

I played a hardcore realism shooter recently and the sun and other bright lights have lens flare. Oh, please tell me, oh wise devs, since when do your ****ing eyes produce lens flare?
2018-08-22, 3:32 PM #32
On an unrelated note, I absolutely despise when people use "realism" to describe games. Someone used "realism" recently to justify a ****ty mechanic, in a game where you can bandage gunshot wounds while moving and shooting. They didn't like it when I suggested they remove those features and you should have to deploy with a combat medic to dress field wounds and drag wounded people out of combat.

Games are just ****ing unrealistic, and so you HAVE to make some compromises. I don't get why people don't see this.
2018-08-22, 4:16 PM #33
jonc where is the RTX ON button in Gorc
gbk is 50 probably

MB IS FAT
2018-08-23, 1:15 PM #34
Originally posted by Reid:
I played a hardcore realism shooter recently and the sun and other bright lights have lens flare. Oh, please tell me, oh wise devs, since when do your ****ing eyes produce lens flair?


What really gets me is lens flares in fantasy games where cameras presumably haven't been invented yet. Who was lens
2018-08-24, 12:09 AM #35
How is it that graphics hardware is improving so rapidly at the same time CPUs are straining for marginal performance gains? Isn't graphics hardware subject to the same physical constraints as CPUs?
"it is time to get a credit card to complete my financial independance" — Tibby, Aug. 2009
2018-08-24, 12:58 AM #36
Originally posted by Freelancer:
How is it that graphics hardware is improving so rapidly at the same time CPUs are straining for marginal performance gains? Isn't graphics hardware subject to the same physical constraints as CPUs?


They're subject to the same physical constraints, but not the same logical ones. Performance is logically limited by Read-After-Write data dependencies. Graphics workloads have very few RAW dependencies, and they happen at very well controlled times. The kinds of single-threaded programs you run on your CPU tend to have a RAW dependency every 4 instructions or so, which means even the fastest general purpose single core processor will only end up retiring about 4 instructions per cycle, even if it's theoretically designed to handle more. In order to increase performance beyond that limit, you need to redesign your programs to have fewer RAW dependencies. This basically turns out to be the same thing as multithreading.

Edit: Back in the 90s/early 2000s Intel and HP collaborated to design a CPU that tried to be parallel the same way a modern GPU is. That chip was called Itanium. The very first Itanium could execute 6 instructions per cycle, which was pretty crazy for the time (Edit 2: latest Intel CPUs do 3, Ryzen can do 4). But the performance benefits never materialized. It's not possible to statically transform a program to reduce or eliminate RAW dependencies. If we could, we'd be able to automatically multithread programs - we wouldn't even need to bother with out of order superscalar, we could literally just throw cores at every program the way a GPU does.
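
For what it's worth, here's roughly what "redesign your programs to have fewer RAW dependencies" looks like in practice - a toy reduction split across threads (my sketch, nothing to do with Itanium's toolchain):

Code:
// split_reduction.cpp - sketch: removing a long RAW chain by partitioning
// the work across threads, i.e. "the restructuring is multithreading".
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<long long> data(1 << 24, 1);

    // Serial: one long RAW chain through 'total'.
    long long total = 0;
    for (long long v : data) total += v;

    // Partitioned: each thread owns an independent partial sum, so the
    // dependency chains are 1/Nth as long and run on separate cores.
    const unsigned n_threads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<long long> partial(n_threads, 0);
    std::vector<std::thread> pool;
    const std::size_t chunk = data.size() / n_threads;
    for (unsigned t = 0; t < n_threads; ++t) {
        pool.emplace_back([&, t] {
            const std::size_t begin = t * chunk;
            const std::size_t end =
                (t + 1 == n_threads) ? data.size() : begin + chunk;
            for (std::size_t i = begin; i < end; ++i) partial[t] += data[i];
        });
    }
    for (auto& th : pool) th.join();
    const long long parallel_total =
        std::accumulate(partial.begin(), partial.end(), 0LL);

    std::printf("serial: %lld, partitioned: %lld\n", total, parallel_total);
}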
2018-08-25, 3:50 PM #37
I'm still playing newly released games on the PC I built 12 years ago. On low settings of course, but still. I'm going for 20.
TAKES HINTS JUST FINE, STILL DOESN'T CARE
2018-08-25, 3:57 PM #38
Originally posted by Roger Spruce:
I'm still playing new game released on the PC I built 12 years ago. On low settings of course, but still. I'm going for 20.


12 years ago I had a Radeon x1950 Pro playing Counter-Strike: Source. Do modern games even support DirectX 9?
2018-08-26, 12:26 PM #39
I have, of course, upgraded my video card once. I forget what the original was but I'm still using a Radeon 5570.
TAKES HINTS JUST FINE, STILL DOESN'T CARE
2018-11-06, 4:49 PM #40
CPU announcement day:

Intel announces Cascade Lake with up to 48 cores per package.

AMD announces Zen 2 Rome with up to 64.