
Thread: Back on the 18 month treadmill (anything computer hardware)

  1. #41
    For some reason that post makes me want to write the following:

    Atari announces Jaguar, the first 64 bit console.
    "I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16


  2. #42
    Admiral of Awesome
    Posts
    16,862
    Except, of course, Jaguar was more or less a fraud committed upon consumers, and AMD will truly have quadrupled best available core counts by the time this thread is 6 months old.

  3. #43
    I never claimed any relevance whatsoever.
    "I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16


  4. #44
    Admiral of Awesome
    Posts
    16,862
    I guess, other than comparing real announcements to a fake number that Atari made up.

  5. #45
    Well, the Nintendo 64 stuck in my head also. Actually, I wanted the Jaguar so much. I think it was an EB Games or Babbages in one of the malls in Anchorage that I would stop by to ogle it often. I think it was AvP and Iron Soldier that had me so interested.
    "I would rather claim to be an uneducated man than be mal-educated and claim to be otherwise." - Wookie 03:16


  6. #46
    I remember how I never actually knew what was 64 Bit about the N64. I still don't know and am too lazy to research it.
    Sorry for the lousy German

  7. #47
    Admiral of Awesome
    Posts
    16,862
    Quote Originally Posted by Impi View Post
    I remember how I never actually knew what was 64 Bit about the N64. I still don't know and am too lazy to research it.
    64-bit machine word.

    Edit: “machine word” roughly means “the number size that is most comfortable for the computer”. Like, if you can easily do mental arithmetic with 3 digit numbers, but you have trouble with 4, I’d say that your machine word is 3 digits. It’s the same way for a microprocessor, except instead of digits they use bits.

    In more technical detail, it means the N64’s CPU has general purpose registers or accumulators that are 64 bits wide, has 64 bit load/store instructions, has an ALU (arithmetic and logic unit) that is 64 bits wide, etc.

    Talking about bits was always silly though because even if a CPU could do 64 bit, it wasn’t always a good idea to do so. N64 games were all compiled for 32 bit because it was faster. Even if the ALU is pure 64 bit, a 32 bit executable still makes better use of memory and caches when you don’t need really big numbers.
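    A minimal C sketch of that point, purely for illustration (nothing N64-specific, just how word size shows up in ordinary code):

        /* Illustrative only: a 64-bit result needs a 64-bit register to hold it,
           while pointer size follows the build target rather than the ALU width. */
        #include <stdint.h>
        #include <stdio.h>

        int main(void) {
            uint32_t a32 = 0xFFFFFFFFu;              /* largest 32-bit value */
            uint64_t a64 = (uint64_t)a32 + 1;        /* 2^32 only fits in 64 bits */

            printf("32-bit wraps to: %u\n", a32 + 1u);                   /* 0 */
            printf("64-bit holds:    %llu\n", (unsigned long long)a64);  /* 4294967296 */

            /* A 32-bit executable uses 4-byte pointers and a 64-bit one uses
               8-byte pointers, which is why a 32-bit build can be friendlier
               to memory and caches when big numbers aren't needed. */
            printf("sizeof(void *) = %zu\n", sizeof(void *));
            return 0;
        }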
    Last edited by Jon`C; 11-09-2018 at 03:27 PM.

  8. #48
    ^^vv<><>BASTART
    Posts
    7,768
    Quote Originally Posted by Jon`C View Post
    I guess, other than comparing real announcements to a fake number that Atari made up.
    Summing up the bits in my RAM, I have the world's first 16 GB system.

  9. #49
    Quote Originally Posted by Jon`C View Post
    Talking about bits was always silly though because even if a CPU could do 64 bit, it wasn’t always a good idea to do so. N64 games were all compiled for 32 bit because it was faster. Even if the ALU is pure 64 bit, a 32 bit executable still makes better use of memory and caches when you don’t need really big numbers.
    Thanks, that would've been my next question. I can't actually imagine that many games made actual use of the 64 bit CPU.
    Sorry for the lousy German

  10. #50
    Admiral of Awesome
    Posts
    16,862
    Quote Originally Posted by Impi View Post
    Thanks, that would've been my next question. I can't actually imagine that many games made actual use of the 64 bit CPU.
    None did. I’ve never developed for the N64, but the official documentation suggests games were all either required or forced to run in 32-bit kernel mode. 64-bit arithmetic operations were available, but the chip was set to use a 32-bit calling convention and a flat 32-bit addressing mode. Kinda like how the original Pentium MMX added 64-bit registers and ALUs, even though the rest of the chip was 32-bit through and through.

    The only reason the N64 was 64-bit is because their vendor, SGI, selected a desktop-grade MIPS R4300, and that chip happened to be 64-bit. There’s a lot of other stuff that’s disabled or unused too; the R4300 was designed to run real-deal operating systems, so it had interrupt timers, protection rings, virtual memory, stuff that isn’t useful for an old school console. They just ignored that stuff. It’s still way cheaper to ship dead silicon than design a custom chip, especially back then.
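    On the MMX analogy, a rough sketch (illustrative only, not from any real codebase): MMX bolted 64-bit registers and ALUs onto an otherwise 32-bit chip, and you can still drive them from C with intrinsics.

        /* Illustrative only: one MMX op works on a single 64-bit __m64 value
           (four packed 16-bit lanes here), even in a 32-bit x86 program. */
        #include <mmintrin.h>
        #include <stdio.h>
        #include <string.h>

        int main(void) {
            __m64 a = _mm_set_pi16(1, 2, 3, 4);      /* one 64-bit MMX register */
            __m64 b = _mm_set_pi16(10, 20, 30, 40);
            __m64 sum = _mm_add_pi16(a, b);          /* four 16-bit adds in one op */

            short out[4];
            memcpy(out, &sum, sizeof out);
            _mm_empty();                             /* MMX state is shared with x87 */

            printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);
            return 0;
        }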

  11. #51
    From this discussion it sounds like Intel is falling behind AMD. That, and the bad press Intel has been getting surrounding things like the i10. The theme seems to be that Intel is panicking and resorting to cheating and lying.
    Last edited by Reverend Jones; 11-09-2018 at 06:13 PM.

  12. #52
    Admiral of Awesome
    Posts
    16,862
    Intel has been falling behind/cheating/lying for a really long time. For a few years they’ve been giving major customers early access to next gen Xeons to stop them from switching over to ARM. The small customers can’t afford to develop ARM server parts so they’re trapped, and for the big ones having a structural performance/power advantage over upstarts is good enough.

    Intel is very much on borrowed time. Sooner or later they’re going to be forced to spin off their fab and IP businesses, and whether Jim Keller can turn around their CPU business is still an open question. Intel hasn’t been flat-footed like this since ever. It’s worse than even the NetBurst/Athlon XP era because the market’s so much bigger and more diverse now, and Intel has been shut out of everything except the premium high-power, single-threaded performance market.
    Last edited by Jon`C; 11-09-2018 at 06:10 PM.

  13. #53
    Admiral of Awesome
    Posts
    16,862
    Quote Originally Posted by Jon`C View Post
    Intel has been falling behind/cheating/lying for a really long time. For a few years they’ve been giving major customers early access to next gen Xeons to stop them from switching over to ARM.

    Intel is very much on borrowed time.
    “Today we are launching EC2 instances powered by Arm-based AWS Graviton Processors. Built around Arm cores and making extensive use of custom-built silicon, the A1 instances are optimized for performance and cost.”

    💥

  14. #54
    Yeah, I saw that the other day.

    For hobbyist purposes, my main reason for sticking with x86-64, other than performance per cost, is convenience: just trying to run a Linux desktop OS on an Arm SoC is likely to expose more untested (or unwritten) branches of code, whether in some library, build script, or kernel driver, to the point that even seeking out something like an Arm or MIPS based laptop isn't worth it. Also, a lot of the hobbyist single-board computers using Arm SoCs you see on the market (like the so-called "plug computers") still have the usual plethora of hardware bugs in their peripheral cores, which kernel developers seem NOT to have contributed workaround patches for (unless, of course, your Arm SoC happens to be used by Android, but in that case, say hello to various proprietary binary blobs in the kernel just to get things like Wi-Fi and Bluetooth working--although I could say the exact same thing about Realtek on x86-64).

    But if Intel chips really do start to show general flaws in comparison to the performance that Arm can achieve with the same level of fabrication technology, then, well, server applications are probably the most trivial to repackage, and you don't even need to worry about all the drivers or whatever weird laptop issue you might have at all: just stick a Docker binary in an AWS instance and do the math for performance / cost, and maybe it'll be the beginning of the end for Intel....
    Last edited by Reverend Jones; 11-28-2018 at 12:17 PM.

  15. #55
    Admiral of Awesome
    Posts
    16,862
    For hobbyist purposes, I'd guess your main reason for sticking to x86-64 is the fact that nobody makes a high performance ARM workstation/desktop.

  16. #56
    Admiral of Awesome
    Posts
    16,862
    I'm a system software developer and I've had to care what microarchitecture I'm using exactly once in the last decade, when I had to make a system call from a library that couldn't link to libc. A normal person might grumble about their old games not working but otherwise they'd never notice.
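    A rough sketch of that one case (illustrative only; the helper name raw_write is made up): making a Linux write() system call without libc, where the syscall number and register convention change with the instruction set.

        /* Illustrative only: the same write() syscall, spelled differently per
           architecture (x86-64 uses syscall number 1 in rax; AArch64 uses 64 in x8). */
        static long raw_write(int fd, const void *buf, unsigned long len) {
            long ret;
        #if defined(__x86_64__)
            __asm__ volatile ("syscall"
                              : "=a"(ret)
                              : "a"(1L), "D"((long)fd), "S"(buf), "d"(len)
                              : "rcx", "r11", "memory");
        #elif defined(__aarch64__)
            register long x8 __asm__("x8") = 64;
            register long x0 __asm__("x0") = fd;
            register long x1 __asm__("x1") = (long)buf;
            register long x2 __asm__("x2") = len;
            __asm__ volatile ("svc #0"
                              : "+r"(x0)
                              : "r"(x8), "r"(x1), "r"(x2)
                              : "memory");
            ret = x0;
        #else
        #error "add your architecture here"
        #endif
            return ret;
        }

        int main(void) {
            raw_write(1, "hello\n", 6);  /* fd 1 = stdout */
            return 0;
        }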

  17. #57
    Quote Originally Posted by Jon`C View Post
    For hobbyist purposes, I'd guess your main reason for sticking to x86-64 is the fact that nobody makes a high performance ARM workstation/desktop.
    That's true. But even with a Raspberry Pi, you get HDMI and ethernet, which is good enough for a thin client, which is good enough to run Emacs and ssh on a nice screen with basically zero heat / noise / power consumption.

    In my experience, though, even with an anaemic CPU the deal-breaker is that you still have ****ty I/O (not just bad performance, but often unreliable components on the SoC that lead to data corruption because the eMMC driver doesn't work around hardware bugs). But probably the worst part of Arm is that, because the market is so thoroughly driven by cell phones, gigabit ethernet controllers are basically non-existent (or were a few years ago, the last time I was into this stuff).

  18. #58
    Quote Originally Posted by Jon`C View Post
    I'm a system software developer and I've had to care what microarchitecture I'm using exactly once in the last decade, when I had to make a system call from a library that couldn't link to libc. A normal person might grumble about their old games not working but otherwise they'd never notice.
    Well, I am not saying the software wouldn't actually compile. I just don't want to deal with build scripts failing to work without modification on a case-by-case basis. Sure, I could just use Debian or Ubuntu, where most OSS has already been packaged for Arm, but there is a long tail of either brittle or abandoned build scripts for software I use that hasn't been recompiled into a dpkg for me against recent libraries, and I am not about to dive into the source just to make sure it doesn't break because of some variable in a shell script or a header file. (And, guess what, if somebody has had the problem on x86, there's a good chance there's a magic answer on SO to get the damn thing to build without needing to know too much of what's actually going on.)

    I mean, if 99% of the system works fine but there is one package that I really like that refuses to build, then the entire system is much worse for me. This has happened to me often enough that I don't bother with Arm for messing around with open source for fun. That probably makes me a bad developer, though, for clinging to the x86 monoculture for lame reasons of laziness.
    Last edited by Reverend Jones; 11-28-2018 at 02:03 PM.

  19. #59
    ^^vv<><>BASTART
    Posts
    7,768
    .
    Last edited by Reid; 11-28-2018 at 04:20 PM. Reason: wrong thread
