The static on old CRT TVs with rabbit ears was the cosmic microwave background. No one in the last 25 years has ever seen it.

  • @Aceticon
    33 points · 1 month ago

    By the way, the picture illustrating the post isn’t actually displaying the real thing - the noise in it is too squarish and has no grey tones.

    • @[email protected]
      link
      fedilink
      171 month ago

      TV static in recent movies and shows set in the past almost always pulls me instantly out of the narrative, because no one seems to be able to get it right and some attempts are just stunningly bad. It’s usually very subtle, so much so that I’m not sure I could even describe what’s wrong. Makes me feel old to notice it.

      • @Aceticon
        14 points · 1 month ago

        I think the problem is that CRT displays didn’t have pixels, so the noise that makes up static was not only spread uniformly in both position and intensity (i.e. greyscale level) but also had “dots” of all sizes.
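
        A quick way to see the difference (just a NumPy illustration, not a physical model - the smoothing length and image size are arbitrary) is to compare per-pixel black/white noise, which is roughly what the bad recreations look like, against continuous greyscale noise whose neighbouring samples are correlated:

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        # What the bad recreations look like: each pixel independently pure black or pure white
        fake = rng.integers(0, 2, size=(512, 512)).astype(float)

        # Closer to the description above: continuous greyscale noise, smoothed along each
        # row so that bright/dark patches span more than one pixel
        smooth = np.ones(5) / 5.0
        real_ish = rng.standard_normal((512, 512))
        real_ish = np.apply_along_axis(lambda row: np.convolve(row, smooth, mode="same"), 1, real_ish)
        real_ish = (real_ish - real_ish.min()) / (real_ish.max() - real_ish.min())

        for name, img in [("pixel noise", fake), ("greyscale noise", real_ish)]:
            levels = np.unique(np.round(img, 2)).size
            # correlation between horizontally adjacent pixels: ~0 for pixel noise
            corr = np.corrcoef(img[:, :-1].ravel(), img[:, 1:].ravel())[0, 1]
            print(f"{name}: ~{levels} grey levels, neighbour correlation {corr:+.2f}")
        ```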

        Also, another possible thing that’s off is the speed at which the noise changes: was it the 25 fps refresh rate of the CRT, related to that rate but not necessarily equal to it, or did the noise itself have more persistent and less persistent parts?

        The noise is basically the result of radio waves at all frequencies and various (though all low) intensities, with only the ones that could pass the bandpass filter of the TV tuner coming through, getting boosted in intensity by the automatic gain control, and being painted onto a phosphor screen (hence no pixels) as the beam drew the screen line by line 25 times per second. So to get that effect right you probably have to simulate it mathematically from a starting point of random radio noise, it can’t be going through anything with pixels (such as 3D textures) on its way to the screen, and it probably needs some kind of procedural shader.
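
        Something along those lines can be prototyped outside a shader as well. Here is a rough NumPy sketch of that pipeline (wideband noise, then a crude band limit, a fake AGC, and painting line by line with a soft beam spot); the line count, bandwidth fraction and beam width are illustrative guesses, not measured CRT parameters:

        ```python
        import numpy as np

        def analog_static_frame(lines=576, samples_per_line=4096, bandwidth=0.15,
                                out_width=1024, beam_sigma=1.2, rng=None):
            """One frame of simulated analog static (illustrative, not calibrated)."""
            rng = np.random.default_rng() if rng is None else rng

            # 1. Wideband random noise - the "radio waves at all frequencies" the tuner
            #    picks up - as one continuous run of samples per scanline.
            noise = rng.standard_normal((lines, samples_per_line))

            # 2. Crude band limit: keep only the lowest `bandwidth` fraction of frequencies
            #    along each line, standing in for the tuner's filtering. This is what gives
            #    the noise structure at many horizontal scales instead of per-pixel speckle.
            spectrum = np.fft.rfft(noise, axis=1)
            cutoff = int(spectrum.shape[1] * bandwidth)
            spectrum[:, cutoff:] = 0.0
            video = np.fft.irfft(spectrum, n=samples_per_line, axis=1)

            # 3. Fake automatic gain control: stretch the frame to full contrast.
            video = (video - video.min()) / (video.max() - video.min())

            # 4. "Paint" each scanline onto the output raster by resampling the continuous
            #    signal - there is no pixel grid along the beam's path.
            x_src = np.linspace(0.0, 1.0, samples_per_line)
            x_dst = np.linspace(0.0, 1.0, out_width)
            frame = np.stack([np.interp(x_dst, x_src, line) for line in video])

            # 5. Soften vertically with a small Gaussian to mimic the finite beam spot.
            taps = np.arange(-3, 4)
            kernel = np.exp(-0.5 * (taps / beam_sigma) ** 2)
            kernel /= kernel.sum()
            return np.apply_along_axis(
                lambda col: np.convolve(col, kernel, mode="same"), 0, frame)

        # 25 of these per second gives something to compare against real footage,
        # e.g. clip = np.stack([analog_static_frame() for _ in range(25)])
        ```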