• @[email protected]
    -15
    1 year ago

    Have you compared NES games on a CRT with the same games on a modern screen?

    CRTs just look miles better.

    • amio
      33
      1 year ago

      EDIT: OK, it’s ackchually not technically “resolution” per se, I get it. :p

      That’s because the graphics were tailored to CRT resolution - which is to say, [things that just so happened to have] low/outright bad resolution.

      CRTs have advantages over more modern stuff but that’s mostly about latency.

      • @Sylvartas
        29
        1 year ago

        It’s not so much about resolution as about exploiting the quirks of CRTs. Artists usually “squished” sprites horizontally (because CRT screens would stretch them) and used the now-famous “half dot” technique to get more subtle shading than was actually possible at the pixel level. So if you just display the original sprites with no stretch and no “bleed” between pixels, they don’t look as good as they should.
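
        Roughly, in NumPy terms (a toy sketch with made-up values; real composite video and CRT optics are far messier):

        ```python
        import numpy as np

        # Checkerboard dither of two palette shades - the classic "half dot" trick.
        dark, light = 64, 192
        sprite = np.fromfunction(lambda y, x: np.where((x + y) % 2, light, dark),
                                 (8, 8), dtype=int)

        # Crude stand-in for the bleed: average each pixel with its horizontal
        # neighbours. On a sharp LCD you see the raw checkerboard; after the
        # blur, the two shades read as one intermediate tone (~128).
        kernel = np.array([0.25, 0.5, 0.25])
        blurred = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"),
            axis=1, arr=sprite.astype(float))
        ```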

        • Zaros
          3
          1 year ago

          I had never heard of this. Fascinating!

      • @grue
        8
        1 year ago

        > That’s because the graphics were tailored to CRT resolution - which is to say, low/outright bad resolution.

        No, it’s because the graphics were tailored to the analog characteristics of CRTs: things like having scanlines instead of pixels and bleed between phosphors. If they were only tailored to low resolution they’d look good on a low resolution LCD, but they don’t.

        I admit I’m quibbling, but the whole thread is that, so…
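
        As a toy illustration of the scanline part (a Python/NumPy sketch only - real CRT shaders also model beam shape, phosphor masks, and bloom):

        ```python
        import numpy as np

        # Any low-res frame will do; the values here are made up.
        frame = np.random.randint(0, 256, size=(240, 320), dtype=np.uint8)

        # Upscale 3x, then dim the rows between scanline centres so each
        # source row reads as a bright line with dark gaps, rather than the
        # solid square pixel a low-resolution LCD would draw.
        scale = 3
        big = frame.repeat(scale, axis=0).repeat(scale, axis=1).astype(float)
        for offset in range(scale):
            big[offset::scale, :] *= (1.0, 0.6, 0.3)[offset]
        ```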

        • amio
          3
          1 year ago

          As a pedant, that is impressive work. Fair enough.

          • @grue
            1
            1 year ago

            Thanks, but @Sylvartas’ reply (which I didn’t read until after writing mine) did a much better job, TBH.

      • @uis
        2
        1 year ago

        CRTs themselves don’t have a concept of resolution.

      • @[email protected]
        -2
        1 year ago

        CRTs don’t have pixels so the resolution of the signal isn’t that important. It’s about the inherent softness you get from the technology. It’s better than any anti-aliasing we have today.

        • @[email protected]
          2
          1 year ago

          CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.

          The exact mechanism varied between manufacturers and types: http://filthypants.blogspot.com/2020/02/crt-shader-masks.html

          I certainly saw aliasing problems on CRTs, though usually on computer monitors that had higher resolution and better connection standards. The image being inherently “soft” is related to limited resolution and shitty connections. SCART with RGB connections will bring out all the jagginess. The exact same display running on composite will soften it and make it go away, but at the cost of a lot of other things looking like shit.
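
          In the spirit of that article, a toy aperture-grille-style mask (a NumPy sketch, not any real shader):

          ```python
          import numpy as np

          # Fake RGB frame; the values are arbitrary.
          img = np.random.randint(0, 256, size=(480, 640, 3)).astype(float)

          # Aperture-grille style: each column passes only one of R, G, B,
          # like a Trinitron's vertical phosphor stripes. Shadow masks and
          # slot masks tile dot triads instead, as the linked article shows.
          stripe = np.arange(img.shape[1]) % 3    # 0,1,2,0,1,2,... per column
          mask = np.eye(3)[stripe]                # (width, 3): one-hot channel
          masked = img * mask[np.newaxis, :, :]   # broadcasts over rows
          ```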

          • @grue
            1
            1 year ago

            > CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.

            Would it, though? I’m skeptical.

            If it did, it wouldn’t be because they have “pixels,” though; it would be because overdriving the deflection yoke with higher-frequency signals would generate too much heat for the TV to handle.

            Otherwise (if it didn’t overheat), it should “work.” The result might look weird if the modulation of the signal didn’t line up with the apertures in the shadow mask right, but I don’t see any reason why sweeping the beam across faster would damage the phosphors. (Also, I’m not convinced a black & white TV would have any problem at all.)

            • @[email protected]
              1
              1 year ago

              It will tend to turn the beam on when it’s off to the side, outside the normal range of the screen. X Windows users in the mid 90s had to put in their exact scanline information or else the screen could blow up. That went away with a combination of multiscan monitors and monitors being able to communicate their preferred settings, but those came pretty late in the CRT era.

              Edit: in any case, color screens need to have at least bands of red/green/blue phosphor. At a minimum, there will be breaks along either the horizontal or vertical lines, if not both.

              • @Aceticon
                1
                1 year ago

                I remember doing that configuration for X.

                You had to tell it when to change lines, when to start firing on a new line (i.e. it changed lines, then waited a bit, and only then started sending data), then when to stop firing, and finally another wait after which it changed lines again (so there was some “empty” time at the start and end of each horizontal electron-gun trace, and you had to tune those so the image didn’t start or end outside the screen).

                There was also something similar for the vertical axis - i.e. instructing it when to go back to the top, plus some empty lines at the top and the bottom.

                I wouldn’t say it was convenient (it was a bloody text file with weird-looking numbers and you did run the risk of blowing up the CRT), but it was kinda fun that you could create your own crazy screen resolutions for X once you understood the principle of the thing.
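
                For anyone who never saw one, a sketch of the arithmetic behind those weird-looking numbers, using the standard VESA 1024x768 @ 60 Hz timings (the modeline is the well-known VESA mode; the Python around it is just illustration):

                ```python
                # XF86Config-style modeline for VESA 1024x768 @ 60 Hz:
                #   Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806
                # The gaps between "display end" and "total" are exactly the
                # waits described above: blanking time when the gun isn't firing.
                pixel_clock_hz = 65.0e6
                h_display, h_sync_start, h_sync_end, h_total = 1024, 1048, 1184, 1344
                v_display, v_sync_start, v_sync_end, v_total = 768, 771, 777, 806

                hsync = pixel_clock_hz / h_total   # how fast the beam sweeps lines
                vsync = hsync / v_total            # how often it returns to the top

                print(f"hsync {hsync / 1e3:.1f} kHz, refresh {vsync:.1f} Hz")
                # hsync 48.4 kHz, refresh 60.0 Hz. Ask a fixed-frequency monitor
                # for rates outside its range and you risked the damage above.
                ```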

              • @grue
                1
                1 year ago

                When you say “blow up” do you mean the tube would literally explode, it would burn through phosphors, a circuit board would let the magic smoke out, or something else?

                I remember configuring mode lines in X. Luckily, I never found out the hard way what happened if you got it wrong.

          • @grue
            1
            1 year ago

            That image is a digital rendering of the raw data, not a photo of how a CRT would render it.

            CRTs were nothing if not the opposite of jagged.

              • @grue
                1
                1 year ago

                “Blurred” is the opposite of “jagged,” though.

                The jaggedness of the 2600 wasn’t because the TV itself was jagged; it was because the 2600 was so low-resolution (160x192, maximum) that it had to be upscaled – naively, with no antialiasing! – even just to get to NTSC (480 scanlines, give or take).

                So yeah, when each “pixel” is three scanlines tall, of course it’s going to look jagged even after the CRT blurs it!
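
                A sketch of that naive upscale (NumPy; the frame is fabricated, the 192-line to 480-line mapping is the point):

                ```python
                import numpy as np

                # Stand-in for a 2600-style frame: 160x192 palette indices.
                frame = np.random.randint(0, 128, size=(192, 160), dtype=np.uint8)

                # Nearest-neighbour scale to 480 lines: each source row just
                # repeats 2-3 times with no filtering, so every edge stays a
                # hard stair-step spanning multiple scanlines.
                rows = np.arange(480) * 192 // 480   # output scanline -> source row
                ntsc = frame[rows]                   # shape (480, 160)
                ```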

                • @Blue_Morpho
                  1
                  1 year ago

                  A high-resolution LCD with anti-aliasing will do a better job than a low-resolution CRT. CRT shadow masks defined the limits of pixels, and it wasn’t good even on computers that could output higher resolutions than the 2600.

    • @echo64
      11
      1 year ago

      CRT filters exist now, and with HDR output (or just sending an HDR-enable signal to get TVs to use the full brightness range) and 4K displays, it’s honestly as good at this point. Or better, because the only good CRTs you can get now are pretty small PVMs/BVMs, and my TV is much bigger than those.

    • @[email protected]
      7
      1 year ago

      There are plenty of upscalers with minimal latency that fix that.

      There also isn’t just “CRT” in this space. Professional video monitors give a very different picture than a consumer TV with only the RF converter input.

      If one more under-25 retro fan tells me that RF tuners are the “true experience”, I’m going to drink myself to death with Malort.

      Edit: please don’t tell me you believe CRTs have zero latency. Because that’s wrong, too.

    • @marx2k
      1
      1 year ago

      Compare a PS5 on a modern-day large-screen 4K TV vs a CRT of your favorite brand from any year.

      If your only use case is playing old consoles, there are filters for current emulators that fill that need adequately.