• DragonTypeWyvern@literature.cafe · 7 months ago

      I’m shocked that a meme creator who used Soldier Boy as the Glorious Past had nostalgia goggles on!

      Ok, well, not that shocked, but honestly I don’t remember it happening either.

    • Ashelyn · 7 months ago

      Who would win: Bloatware on mass-produced pre-builts, or a thumb drive with the Windows Media Creation Tool?

  • frezik@midwest.social · 7 months ago

    It’s almost like having double frame buffers for 720p or larger, 16-bit PCM audio, memory-safe(ish) languages, streaming video, security sandboxes, rendering fully textured 3D objects with a million polygons in real time, etc. are all things that take up CPU and RAM.
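
    To put rough numbers on two of those (a back-of-the-envelope sketch; the 32-bit RGBA pixels and CD-quality stereo are my own assumptions, not figures from above):

    ```typescript
    // Rough memory math for double-buffered 720p video and 16-bit PCM audio.
    // Pixel format and sample rate are assumptions (RGBA8888, 44.1 kHz stereo).

    const bytesPerPixel = 4;                            // RGBA8888
    const frameBuffer720p = 1280 * 720 * bytesPerPixel; // one buffer
    const doubleBuffered = frameBuffer720p * 2;         // front + back buffer

    const pcmPerSecond = 44_100 * 2 * 2; // samples/s × 2 bytes × 2 channels

    console.log(`720p double buffer: ${(doubleBuffered / 2 ** 20).toFixed(1)} MiB`);
    console.log(`1 s of 16-bit PCM:  ${(pcmPerSecond / 1024).toFixed(0)} KiB`);
    // ≈ 7.0 MiB of frame buffers alone — about 56× the SNES’s entire 128 KiB of RAM.
    ```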

    • reddig33@lemmy.world · 7 months ago

      I didn’t realize web browsing in Chrome required fully textured 3D objects. Not to mention playing 720p video with PCM audio in a separate app doesn’t grind everything to a halt.

      • voxel@sopuli.xyz · 7 months ago (edited)

        Well, the GPU doesn’t care if it’s 2D or 3D, but you are rendering a whole bunch of textured triangles (separated into tiles for fast partial or multithreaded re-rendering), just-in-time rasterizing fonts, running a complex constraint solver to lay out the UI, parsing three completely separate languages (HTML, CSS, and JavaScript), communicating over multiple complex network protocols, and doing a whole bunch of interprocess communication to sandbox everything.
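
        To make the tiling point concrete, here’s a minimal sketch (tile size and names are illustrative, not Chromium’s actual compositor code) of how a tiled renderer re-rasterizes only the tiles a change touches:

        ```typescript
        // Minimal dirty-tile sketch: re-rasterize only the tiles a damage
        // rectangle touches. Tile size and names are illustrative.
        const TILE = 256; // px

        interface Rect { x: number; y: number; w: number; h: number }

        function dirtyTiles(damage: Rect): Array<[number, number]> {
          const tiles: Array<[number, number]> = [];
          const x0 = Math.floor(damage.x / TILE);
          const y0 = Math.floor(damage.y / TILE);
          const x1 = Math.floor((damage.x + damage.w - 1) / TILE);
          const y1 = Math.floor((damage.y + damage.h - 1) / TILE);
          for (let ty = y0; ty <= y1; ty++)
            for (let tx = x0; tx <= x1; tx++) tiles.push([tx, ty]);
          return tiles;
        }

        // A blinking 200×20 text caret dirties a single tile, not the page:
        console.log(dirtyTiles({ x: 300, y: 40, w: 200, h: 20 })); // [[1, 0]]
        ```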

      • frezik@midwest.social · 7 months ago

        There are shared libraries that have to be loaded whether or not you have a tab that uses them.

    • MonkderDritte@feddit.de · 7 months ago (edited)

      Are you talking about games? There it’s mostly textures.

      The web is a whole other story; that’s why it uses so much RAM.

      • frezik@midwest.social · 7 months ago

        WebGL means the browser has access to the GPU. Also, the whole desktop tends to be rendered as a 3D space these days. It makes things like scaling and blur effects easier, among other benefits.
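
        For anyone curious, that GPU access is one call away from any page script (a minimal sketch; real code would want error handling and shaders):

        ```typescript
        // Any web page can request a GPU-backed drawing context: that’s WebGL.
        const canvas = document.createElement('canvas');
        const gl = canvas.getContext('webgl2') ?? canvas.getContext('webgl');

        if (gl) {
          // From here on, the page is issuing actual GPU commands.
          gl.clearColor(0, 0, 0, 1);
          gl.clear(gl.COLOR_BUFFER_BIT);
        }
        ```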

  • HEXN3T · 7 months ago (edited)

    “Any SNES game” is pretty much just F-ZERO.

    Actually, fun fact: F-ZERO runs at a locked 60 FPS in every single release: SFC, N64, GBA, and GC. It’s some really impressive stuff for the N64.

  • Emerald@lemmy.world · 7 months ago

    I have a ThinkPad T61, a laptop from 2007, with only 4 GB of RAM. I can open Firefox with 10 tabs, including a YouTube video at 480p, and still have 1 GB of RAM left. Yet people act like 8 GB is unusable these days.

  • brokenlcd@feddit.it · 7 months ago

    If we need to get into this kind of debate, may I remind everyone that the computer that took humanity to the Moon had about 2K words of RAM.

    • Album@lemmy.ca · 7 months ago (edited)

      The console ran at 60 Hz on NTSC and 50 Hz on PAL. Divide by two to get the standard frame rate.

        • RightHandOfIkaros@lemmy.world · 7 months ago

          The Super Nintendo’s interlaced video mode was basically never ever used. It could output 60 Hz and more often than not did.

          Only some games had a limited framerate for various reasons, such as Another World being limited by cartridge RAM or Star Fox being limited by the power of the Super FX. Yoshi’s Island also used the Super FX and wasn’t limited the way Star Fox was. Occasionally there was slowdown if a developer put too much on screen at once, but these moments were brief, similar to today when a game hitches while loading a new area during gameplay.

    • Refurbished Refurbisher@lemmy.sdf.org · 7 months ago

      At 480i. SNES used 240p, which is technically not standard NTSC, but compatible. Nintendo called this “double strike”, since each field would display in the same location.

      • Refurbished Refurbisher@lemmy.sdf.org · 7 months ago (edited)

        It is 59.94 fields per second, translating into 29.97 FPS. Interlaced video is fun. The reason it’s not a round 60 or 30 FPS is to maintain compatibility with black-and-white sets.

        240p uses each field as a frame, though, while still maintaining compatibility with NTSC. This is what most consoles before the 6th generation used (same with PAL, but 288p at 50 FPS).
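
        For the curious, the exact numbers fall out like this (the 1000/1001 factor is that black-and-white compatibility offset):

        ```latex
        % NTSC color timing: the field rate was pulled down by a factor of
        % 1000/1001 so the color subcarrier wouldn't interfere with
        % black-and-white receivers.
        f_{\mathrm{field}} = \frac{60000}{1001} \approx 59.94\ \mathrm{Hz},
        \qquad
        f_{\mathrm{frame}} = \frac{f_{\mathrm{field}}}{2}
                           = \frac{30000}{1001} \approx 29.97\ \mathrm{Hz}
        % 240p treats each ~59.94 Hz field as a complete frame,
        % hence "60 FPS" consoles.
        ```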

          • Blue_Morpho@lemmy.world · 7 months ago

            Interlacing is native to US broadcast TV. CRTs don’t have to be interlaced; computer CRTs were rarely interlaced.

            • partial_accumen@lemmy.world · 7 months ago

              Okay fine, be particular and ignore the context. Interlacing is native on CRT displays WHEN DISPLAYING NTSC OR PAL, which is what the SNES was made for.

              • Blue_Morpho@lemmy.world · 7 months ago (edited)

                I’m just being nitpicky because you are using CRT interchangeably with television. CRTs are used in TVs but aren’t interlaced unless the circuitry around them sends an interlaced signal. So no, interlacing is not native to CRTs; they’re only interlaced when fed an interlaced signal. If I plugged a Nintendo into my old ViewSonic CRT, I wouldn’t get a signal, because it didn’t support NTSC interlaced input.

                It’s like saying interlacing is native to LCDs. LCD TVs are interlaced, not LCDs.

                • partial_accumen@lemmy.world · 7 months ago (edited)

                  I’m just being nitpicky because you are using CRT interchangeably with television.

                  That was intentional on my part, for the sake of the audience and good communication. You’re technically correct, but without a paragraph of tangential and irrelevant explanation your audience isn’t going to understand you. In modern parlance, “television” isn’t the CRT appliance; it’s any appliance that shows the moving pictures and sound of television programming. If you walk into any store today and buy a TV, you’re going to get an LCD, AMOLED, or quantum-dot display. None of those are CRTs, yet everyone born after about 2002 will associate a TV with a flat-panel, non-CRT display.

                  So no, interlacing is not native to CRTs; they’re only interlaced when fed an interlaced signal.

                  And in nobody’s mind was the vision of plugging an SNES into a computer-monitor CRT. You introduced that idea only to show how it’s wrong. You win at pedantry, but lose at communication.

                  If someone says to you “I’m watching TV,” do you poke your head around the back of the unit to make sure it has a tuner in it, and if it doesn’t, quip back to correct them: “You’re not actually watching a TV, you’re watching a monitor. A TV requires a tuner, which this unit does not have, making it a monitor, not a TV”?

    • ShortFuse@lemmy.world · 7 months ago (edited)

      Even interlaced, it’s still 60 frames per second.

      Sure, the output was technically 30 full frames (60 “fields”) per second, but most games updated 60 times a second, even SMB on the NES. You only saw one half of what the console rendered internally, which is an output issue, not a rendering one.

      Add on 480p and you get both 60 frames and 60 fields per second.
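
      A toy model of that output stage (purely illustrative; not how any console actually works internally): the game renders a full frame every tick, but interlaced output only scans out alternating lines of it:

      ```typescript
      // Toy model: the console renders a complete frame 60 times a second,
      // but interlaced output shows only every other scanline per tick.
      type Frame = string[]; // one string per scanline

      function fieldOf(frame: Frame, tick: number): Frame {
        const parity = tick % 2; // even tick -> even lines, odd -> odd lines
        return frame.filter((_, line) => line % 2 === parity);
      }

      // Each 60 Hz tick you see half the lines of a freshly rendered frame,
      // so motion still updates 60 times a second.
      ```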

  • xkforce@lemmy.world · 7 months ago (edited)

    3 pixels on the screen that you have to squint at and use your imagination.

      • RayOfSunlight@lemmy.world · 7 months ago

        Huh, interesting. Still, it’s funny how a GBA is more powerful than the SNES. Lots of respect for both, though, even if I didn’t get to try them before.