• vzq · 26 points · edited · 2 months ago

    deleted by creator

    • aard@kyu.de · 17 points · 1 year ago

      They were interesting, but only good for a very narrow purpose - not a great place to be when the trend back then was moving away from special-purpose machines toward general-purpose ones.

      Intel didn’t plan for it to be just a special-purpose CPU - it just ended up that way. The fact that they gave their first customers free Alpha workstations for cross-compiling code, because that was faster than compiling natively, should tell you everything you need to know about the Itanic’s suitability as a general-purpose system.

      • al177@lemmy.sdf.org · 1 point · 1 year ago

        I never used Itanium, but I’m guessing the Alpha workstations also ran x86 code faster than the Itaniums did. FX!32 was one of those DEC marvels that they completely forgot to market.

        • aard@kyu.de · 1 point · edited · 1 year ago

          Yeah, but x86 was relatively cheap. Alpha and Itanium were in a similar price range.

          At that time Alpha belonged to Compaq - and they stopped Alpha development (and canned quite a few good designs that were pretty much ready to go), expecting they’d be able to replace it with Itanium.

    • rotopenguin@infosec.pub · 9 points · 1 year ago

      Were they marvels, though? Itanium made good business sense in that it would cut AMD out of the market, but it was shit technology. Itanium would also have done a good job of cutting GCC out of the compiler market, which would have been great news for ICC. If everybody had to buy Intel compilers, boy, that would have changed the software market.

      You shouldn’t be making the compiler guess at conditions on the ground that the CPU should be inferring for itself, such as “which data dependencies are in cache and could be executing out of order right now?”. You shouldn’t be making the compiler spend instructions and memory bandwidth describing this stuff. You shouldn’t be making code that works well on exactly one generation of CPU, one pipeline design, and is trash on the next. Once upon a time, MIPS saved a few gates by making the branch delay slot part of the ISA, and that became an albatross as soon as the pipeline no longer matched the original design. Itanium is all about making that kind of design decision everywhere. Itanium is the Microsoft Word of ISAs, where the spec is “whatever my implementation does is the correct thing.”
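
      To make the delay-slot point concrete, here’s a minimal Python sketch of a hypothetical machine - not real MIPS; the opcodes, the program, and the three-slot variant are all invented for illustration. A compiler that knows the ISA promises exactly one delay slot hoists useful work into it; run the same binary on an implementation with a different slot count and it silently computes the wrong answer, which is why the original count has to be faked in hardware forever:

      ```python
      # Toy model of a made-up ISA where "the N instructions after a taken
      # branch always execute" is an architectural promise (a delay slot),
      # as early MIPS did with N=1. Nothing here is real MIPS.

      def run(program, delay_slots):
          """Interpret `program`, executing `delay_slots` instructions after
          every taken branch before the jump actually happens."""
          regs = {"r1": 0, "r2": 0}
          pc = 0
          pending = None  # [slot instructions left to drain, branch target]
          while pc < len(program):
              op, *args = program[pc]
              pc += 1
              if op == "addi":            # addi reg, imm
                  reg, imm = args
                  regs[reg] += imm
              elif op == "beq_zero":      # branch to target if reg == 0
                  reg, target = args
                  if regs[reg] == 0:
                      pending = [delay_slots, target]
                  continue                # the branch itself fills no slot
              if pending is not None:
                  pending[0] -= 1
                  if pending[0] == 0:
                      pc = pending[1]
                      pending = None
          return regs["r2"]

      # Scheduled by a compiler that KNOWS there is exactly one delay slot:
      # the useful "addi r2, 1" was hoisted into the slot to run for free.
      program = [
          ("beq_zero", "r1", 4),   # 0: taken (r1 == 0), target is 4
          ("addi", "r2", 1),       # 1: the delay slot -- runs either way
          ("addi", "r2", 100),     # 2: meant to be skipped
          ("addi", "r2", 1000),    # 3: meant to be skipped
          ("addi", "r2", 10),      # 4: branch target
      ]

      print(run(program, delay_slots=1))  # 11   -- what the compiler intended
      print(run(program, delay_slots=3))  # 1111 -- same binary on a deeper
                                          # "improved" pipeline: silently wrong
      ```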

      The immediate failure of Itanium was the pitch of “you are buying a new, more expensive system that runs your current x86 code worse”, with the expectation that every generation of Itanium would go the same way. Just as your software starts getting good, here comes the new chip that will someday make things faster - but you never see that payoff until just about the end of the product cycle.

      • vzq · 5 points · edited · 2 months ago

        deleted by creator