Screens keep getting faster. Can you even tell? | CES saw the launch of several 360Hz and even 480Hz OLED monitors. Are manufacturers stuck in a questionable spec war, or are we one day going to wonder how we ever put up with ‘only’ 240Hz displays?

  • @aaaantoine · 34 points · 10 months ago

    On one hand, 360Hz seems imperceptibly faster than 240Hz for human eyes.

    On the other hand, if you get enough frames in, you don’t have to worry about simulating motion blur.

    • DosDude👾 · 71 points · 10 months ago

      I never worry about motion blur, because I turn it off. The stupidest effect ever. If I walk around I don’t see motion blur. Cameras see motion blur because of shutter speed, not the human eye.

      • @[email protected] · 37 points · 10 months ago

        Umm, well, there is something like motion blur experienced by humans. In fact, your brain creates a time-bending effect based on picture 1 and picture 2:

        https://www.abc.net.au/science/articles/2012/12/05/3647276.htm

        There is a trick where you watch a clock that counts seconds, quickly turn your head away and then back (or something like that), and you will see that the rate of the seconds seems inconsistent.

        See “1. CHRONOSTASIS” https://bigthink.com/neuropsych/time-illusions/

        • DosDude👾 · 16 points · 10 months ago

          Alright, I didn’t know, thanks. Though human motion blur is vastly different from camera blur in my experience. And games that have motion blur look really unnatural.

          • @[email protected] · 4 points · 10 months ago

            I don’t know if there is scientific proof that every human experiences “motion blur” the same way. I would bet not.

          • VindictiveJudge · 4 points · 10 months ago

            More realistic blur smudges things based on how the object is moving rather than how the camera is moving. For example, Doom Eternal applies some blur to the spinning barrels and the ejected shells on the chaingun while it’s firing, but doesn’t blur the world while you’re sprinting.

            • @daellat · 2 points · 10 months ago

              Yup, this is called per-object motion blur, and it’s more common in modern games. I’m still not that big a fan, but I’ve heard good things about it from other high-framerate enjoyers. (A rough sketch of the idea follows below.)
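
              For anyone curious how that works: the renderer writes a per-pixel velocity buffer, and each pixel is smeared only along its own motion vector, so static scenery stays sharp while fast-moving parts streak. A minimal numpy sketch of the idea; the function name, sample count, and buffer layout are illustrative assumptions, not any particular engine’s implementation:

              ```python
              import numpy as np

              def per_object_motion_blur(image, velocity, samples=8):
                  """Average samples along each pixel's own motion vector.

                  image:    (H, W, 3) float array, the rendered frame
                  velocity: (H, W, 2) float array, per-pixel motion in pixels;
                            zero for static geometry, so the world stays sharp
                  """
                  h, w, _ = image.shape
                  ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
                  acc = np.zeros_like(image)
                  for i in range(samples):
                      # Sample positions from -0.5 to +0.5 of the motion vector
                      t = i / max(samples - 1, 1) - 0.5
                      sx = np.clip(xs + velocity[..., 0] * t, 0, w - 1).astype(int)
                      sy = np.clip(ys + velocity[..., 1] * t, 0, h - 1).astype(int)
                      acc += image[sy, sx]
                  return acc / samples
              ```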

        • @[email protected] · 9 points · 10 months ago

          On the other hand, humans don’t see in defined frames. The signals aren’t synchronized. So a big part of perceived blurring is that the succession of signals isn’t forming a single focused image. There isn’t really a picture 1 and picture 2 for your brain to process discretely. And different regions in your vision are more sensitive to small changes than others.

          A faster refresh rate is always “better” for the human eye, but you’ll need higher and higher panel brightness to have a measurable reaction time difference.

          But hitting really high refresh rates requires too many other compromises on image quality, so I won’t personally be paying a large premium for anything more than a 120Hz display for the time being.

          • @[email protected] · 2 points · 10 months ago

            I agree; human eyes register only changes in light, in an analog-style way (no framerate, more something like waves, as I understand it), compared to cameras, which register all light on every frame. I simplified that part with the “pictures” because I thought it was more understandable like that. I guess better would have been something like „your eyes kinda shut down during fast movements of the head, and your brain makes up for that by generating a nice transition“.

      • Ms. ArmoredThirteen · 10 points · 10 months ago

        Motion blur in games gives me bad motion sickness and garbles what I’m seeing. I already have a hard enough time processing information fast enough in any kind of fast-paced game; I don’t need things to be visually ambiguous on top of that.

    • Ms. ArmoredThirteen · 3 points · 10 months ago

      That also depends on the person. Save for really fast-moving things I can barely tell the difference between 30 and 60fps, and I cap out at 75 before I can’t notice a difference in any situation. For one of my friends, anything less than 75 gives them headaches from the choppiness.

      • @[email protected] · 8 points · 10 months ago

        Yeah, personally playing games at 30fps feels disruptively laggy, at least for the first few minutes. 60 is good, but the jump to 120 is night and day. I was shocked that going from 120 to 240 was just as noticeable an improvement as the previous jump, especially when so many people say they don’t notice it much. Hard to find newer games that give me that much fps, though.

  • a1studmuffin · 29 points · 10 months ago

    I’d much rather they invest effort into supporting customisable phones. Instead of just releasing a few flavours of the same hardware each year, give us a dozen features we can opt into or not. Pick a base size, then pick your specs. Want a headphone jack, SD card, FM radio, upgraded graphics performance? No problem, that’ll cost a bit extra. Phones are boring now; at least find a way to meet the needs of all consumers.

    • @[email protected] · 17 points · 10 months ago

      Not exactly what you are talking about, but slightly related: the company Fairphone makes phones with parts that can easily be replaced. The philosophy is that you will not have to buy a new phone every 3 years. They do have some customisation options as well (i.e. RAM, storage, models), but it’s limited.

      But going full-on customisation with phones, laptops, and tablets, similar to a desktop, is just incredibly hard due to the lack of space in the device for the components. As such it makes more sense to offer a wide variety of models, with some customisable options, and then have the user pick something.

      • @[email protected] · 10 points · 10 months ago

        On Fairphone: they flat out refuse to even discuss adding a headphone jack (check the posts in their forums; it’s a “hands over ears” no), so I’m sticking with Sony/ASUS (the latter at the moment, as they’ve been slightly less anticompetitive recently, but I’d much rather go to a decent company) until they do… It’s not like you notice a phone being 1mm thicker when you have a 3mm case on it anyway.

        • @[email protected] · 1 point · 10 months ago

          Their answer is buying the USB-C to 3.5mm adapter. If you keep that connector in your bag, or connected to your headphones, you should be fine most of the time. Unless you would like to charge and listen to audio at the same time.

          To me, that feels like a solid design choice, but yes we all have our dealbreakers.

          • @[email protected] · 2 points · 10 months ago

            Solid in the same way the designers’ heads are solid bone, I guess…

            A 3.5mm adapter is not an answer, as it causes wear on the USB-C port in ways it’s not designed for (whereas a 3.5mm port is designed for it: it’s circular, so the cable rotates and breaks before the port does). And it’s hard to get a good DAC that isolates the power noise when using a combined charging/listening adapter that’s also that small.

      • @[email protected] · 2 points · 10 months ago

        My problem with Fairphone is that they use old hardware.

        I never replaced any parts on my old phone, and I only replaced the phone with a new one because it was getting really slow. I replaced the XR with an iPhone 15.

        So my concern with the Fairphone is that I’ll replace it with faster hardware more frequently than I would have replaced a non-repairable phone that’s faster.

    • stevecrox · 8 points · 10 months ago

      I wish a company would build 4.5"-5.5" and 5.5"-6.5" flagship phones, putting as many features as make sense into each.

      Then when you release a new flagship, the last flagship devices become your ‘mid range’ and you drop the price accordingly, with your mid range dropping to budget the year after.

      When Nokia had 15 different phones out at a time it made sense because they would be wildly different (size, shape, button layout, etc…).

      These days everyone wants as large a screen as possible on a device that is comfortable to hold, we really don’t need 15 different models with slightly different screen ratios.

        • @[email protected] · 2 points · 10 months ago

          Yay I’m part of something! :-)

          I updated to the latest mini iPhone after rumours it would be the last. I hope not, but the trend seems to be bigger and more ridiculous “phone” form factors.

      • @wikibot [bot] · 5 points · 10 months ago

        Here’s the summary for the wikipedia article you mentioned in your comment:

        Project Ara was a modular smartphone project under development by Google. The project was originally headed by the Advanced Technology and Projects team within Motorola Mobility while it was a Google subsidiary. Google retained the ATAP group when selling Motorola Mobility to Lenovo, and it was placed under the stewardship of the Android development staff; Ara was later split off as an independent operation. Google stated that Project Ara was being designed to be utilized by "6 billion people": 1 billion current smartphone users, and 5 billion feature phone users.

        Under its original design, Project Ara was intended to consist of hardware modules providing common smartphone parts, such as processors, displays, batteries, and cameras, as well as modules providing more specialized components, and "frames" that these modules were to be attached to. This design would allow a device to be upgraded over time with new capabilities without requiring the purchase of an entire new device, providing a longer lifecycle for the device and potentially reducing electronic waste. However, by 2016, the concept had been revised, resulting in a base phone with non-upgradable core components, and modules providing supplemental features.

        Google planned to launch a new developer version of Ara in the fourth quarter of 2016, with a target bill of materials cost of $50 for a basic phone, leading into a planned consumer launch in 2017. However, on September 2, 2016, Reuters reported that two non-disclosed sources leaked that Alphabet's manufacture of frames had been canceled, with possible future licensing to third parties. Later that day, Google confirmed that Project Ara had been shelved.

        to opt out, pm me ‘optout’. article | about

  • @[email protected] · 22 points · 10 months ago

    Reminiscent of the hi-res audio marketing. Why listen at a measly 24-bit/48kHz when you can have 32/192?!

    • @[email protected] · 19 points · 10 months ago

      These refresh-rate bumps have an actual perceivable difference, even if subtle. The difference with hi-res audio, however, is inaudible to humans. (Quick numbers below.)
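
      The usual back-of-the-envelope numbers behind that claim: 48kHz already captures everything up to its 24kHz Nyquist limit, above the roughly 20kHz ceiling of human hearing, and 16 or 24 bits of depth give more dynamic range than any playback chain delivers. A quick check in Python:

      ```python
      import math

      def nyquist_khz(rate_hz):
          """Highest frequency a sample rate can represent (the Nyquist limit)."""
          return rate_hz / 2 / 1000

      def dynamic_range_db(bits):
          """Quantization dynamic range, roughly 6 dB per bit."""
          return 20 * math.log10(2 ** bits)

      print(nyquist_khz(48_000))    # 24.0 kHz, already past the ~20 kHz limit of human hearing
      print(nyquist_khz(192_000))   # 96.0 kHz, far beyond anything audible
      print(dynamic_range_db(16))   # ~96 dB
      print(dynamic_range_db(24))   # ~144 dB, more than any playback chain or room delivers
      ```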

      • @[email protected] · 11 points · 10 months ago

        I tend to agree, but the audiophiles always have an answer to rebut it with.

        I’m into audio and headphones, but since I’ve never been able to reliably discern a difference with hi-res audio, I no longer let it concern me.

        • @PastyWaterSnake · 10 points · 10 months ago

          I’ve bought pretty expensive equipment: a tube amplifier, many fancy headphones, optical DACs, a library full of FLAC files. I even purchased a $500 portable DAP. I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files. At this point, it really doesn’t concern me anymore either, but I at least like to see my fancy tube amp light up.

          I will say, though, $300 seems to be the sweet-spot for headphones for me.

          • @[email protected] · 5 points · 10 months ago

            > I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files

            I just keep FLAC around so I can transcode them to new lossy formats as they improve, and so I can transcode aggressively for my mobile when I’m streaming from home and don’t need full transparency. (Something like the sketch below.)
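
            A minimal sketch of that keep-FLAC-and-transcode-down workflow, assuming ffmpeg with libopus is on the PATH; the library paths and bitrate are placeholders:

            ```python
            import subprocess
            from pathlib import Path

            SRC = Path("~/music-flac").expanduser()     # hypothetical library location
            DST = Path("~/music-mobile").expanduser()   # hypothetical output location

            for flac in SRC.rglob("*.flac"):
                out = (DST / flac.relative_to(SRC)).with_suffix(".opus")
                if out.exists():  # skip tracks already transcoded
                    continue
                out.parent.mkdir(parents=True, exist_ok=True)
                # 96k Opus is an aggressive bitrate for mobile streaming
                subprocess.run(
                    ["ffmpeg", "-i", str(flac), "-c:a", "libopus", "-b:a", "96k", str(out)],
                    check=True,
                )
            ```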

          • @pete_the_cat · 3 points · 10 months ago

            Yeah, there’s a clear difference between a pair of $25 or $50 headphones and a pair that costs a few hundred. When I first got my Sony WH-1000XM3s I let my coworker try them, and he said, “Wow, I didn’t know music could sound this good!” When I upgraded to the XM4s a few years later I let my brother try them, and he was similarly impressed.

            Beyond a few hundred, into the thousand-dollar range, you hit diminishing returns.

          • @[email protected] · 1 point · 10 months ago

            Blackmail – Evon. That’s the one song where I ever heard a difference, though that was Ogg; dunno what bitrate I used back then, but it was sufficient for everything else. Listening on YouTube: yep, that’s mushy. The noisy goodness that kicks in at 0:30 is crisp as fuck on CD.

            …just not the kind of thing those codecs are optimised for, I’d say. Also it still sounds fine, just a bit disappointing if you ever heard the uncompressed thing. Which is also why you should never try electrostatic headphones.

        • bitwolf · 6 points · 10 months ago

          Imo the biggest bump is from MP3 to lossless. The drums sound more organic on FLACs, whereas on most MP3s they sound like a computer MIDI sound.

          The biggest bump for me, though, was the change in headphones. It made my really old AAC 256kbps music sound bad.

          • Kogasa · 2 points · 10 months ago

            320kbps CBR and V0 VBR MP3 are audibly transparent. Most likely, 250kbps and V2 are too.

          • @[email protected] · 1 point · 10 months ago

            Tried FLAC vs 192kbps Vorbis with various headphones, e.g. Moondrop Starfield, FiiO FA1, Grado SR80x…

            Can’t tell a difference. Kept using Vorbis.

        • @[email protected] · 2 points · 10 months ago

          I’d somewhat call myself an audiophile, just one that cares about actual measurements and audibility, and not snake oil. Haven’t heard a good term for that yet, though.

          Audiophiles also tend to care about some sort of audio purity, but I’m willing to go wild with EQ, room correction, and impulse responses, which is pretty much the opposite of purity.

      • @[email protected] · 6 points · 10 months ago

        There are tests you can take to see if you can hear the difference. A lot of people fail! Lol

        • @[email protected] · 3 points · 10 months ago

          Usually percussion is where it’s easiest to notice the difference. But typically people prefer the relatively more compressed sound!

      • @[email protected] · 1 point · 10 months ago

        I thought I could hear a difference with hi-res audio, but after reading up on it I’m starting to think it may have been some issue with the tech I was using, whether my headphones or something else, that made compressed audio sound veeeery slightly staticky when high notes or loud parts of the track played.
        Personally, though, even if it wasn’t, the price of the equipment wasn’t worth it for a difference that was only perceptible if I was listening for it. Not to mention it’s near impossible to find hi-res tracks from most bands. Most tracks claiming to be hi-res are just converted low-res tracks and thus have no actual difference in sound quality; the only difference is that the file is way larger for no good reason.

  • @pete_the_cat · 18 points · 10 months ago

    At this point it’s a dick measuring contest.

  • @[email protected] · 17 points · 10 months ago

    Well no, because most people aren’t getting them. It’s nice, but it’s difficult to justify spending hundreds on a slightly better screen.

      • @TAYRN · 7 points · 10 months ago

        Cool, so in a few years we’ll have a screen which isn’t better in any noticeable way?

        • @Plopp · 14 points · 10 months ago

          Don’t be so negative, imagine a phone screen at 480 Hz. It’ll be great for when you have too much charge left in your battery and need to drain some.

        • Kushan · 7 points · 10 months ago

          Well, it’s more that you’ll get the usable parts without a huge premium. There was a time when monitors faster than 60Hz were premium, but now it’s pretty common to see 120Hz and beyond on even basic monitors.

          There’s still diminishing returns as you go higher, but there’s definitely a noticeable difference between 60Hz and 120Hz, as well as a smaller but still noticeable difference between 120Hz and 240Hz. 240Hz is becoming more standard now on regular high-end monitors and beginning to trickle down too.

          Beyond that, in terms of response times, you might not notice a difference between 240Hz and 360Hz, but image clarity will be better because you’ll get less ghosting by virtue of the pixels changing so quickly, so it’s not entirely useless.

          Part of the reason you’re seeing this is because they can. The panel technology (OLED in this case) is super fast due to its design, so it’s not too costly to add the necessary hardware to drive those speeds. For LCD tech, you have to drive the panels faster and harder; that’s why older screens required shitty TN panels to get those refresh rates. But everything else has been around for a while. (Some quick frame-time numbers below.)
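
          To put numbers on those diminishing returns: each step up in refresh rate shaves less off the frame time than the previous one. A quick calculation over the common tiers:

          ```python
          # Frame time at each refresh rate, and how much each step actually saves
          rates = [60, 120, 240, 360, 480]
          for prev, cur in zip(rates, rates[1:]):
              saved = 1000 / prev - 1000 / cur
              print(f"{prev} -> {cur} Hz: {1000/prev:.2f} ms -> {1000/cur:.2f} ms (saves {saved:.2f} ms)")
          # 60 -> 120 Hz saves 8.33 ms per frame; 360 -> 480 Hz saves only 0.69 ms,
          # which is why each jump feels smaller even when the OLED pixels keep up.
          ```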

  • Lemminary · 15 points · 10 months ago

    Finally, a screen with the refresh rate that my cat can enjoy! He sure is gonna love that Tom & Jerry like no other cat that ever lived.

      • Lemminary · 6 points · 10 months ago

        Yeah, I’m aware. My main concern is whether the screen flickers for them, not whether the animation is smooth. lol

      • NickwithaC · 2 points · 10 months ago

        Motion smoothing tech finally has a reason to exist!

  • @[email protected] · 13 points · 10 months ago

    I don’t need or want a phone over 90Hz or a PC screen over 180Hz. A phone over that is a waste of battery, and a PC screen over that is a waste of money.

    • @[email protected] · 24 points · 10 months ago

      Then don’t buy them? With better screens coming out the ones you do want to buy get cheaper.

      Back in the day 144Hz screens cost a premium; now you can have them for cheap.

      • @Potatos_are_not_friends · 10 points · 10 months ago

        I stopped buying TVs from 2000 until like two years ago, when I saw them on sale for like $200. Been living off of projectors and a home server. I skipped so many “innovations” like curved, flat, HD, 4K, TrueColor.

        Weird that it has an OS; that was a shocker.

        I look forward to what TVs bring in 2040.

        • @[email protected] · 9 points · 10 months ago

          I mean, OLEDs are damn amazing image-quality-wise, but I’m also not a fan of “smart” TVs. The apps can be useful (like native Netflix, Amazon Video, and so on), but 90% of the time I use my PC over HDMI.

          • @[email protected] · 2 points · 10 months ago

            I use my Chromecast dongle for my smart TV. My smart TV will never get an internet connection of its own.

        • @devfuuu · 1 point · 10 months ago

          You know what’s hot? 3D televisions!!

          I’m so glad that hype died out once people understood it was stupid. Just thinking about all the ones who bought one…

      • @[email protected] · -1 points · 10 months ago

        I didn’t say I wanted them disallowed from being made. Just that it’s dumb to buy them.

        • Dark Arc · 1 point · 10 months ago

          I think there’s an argument for making screens faster. Graphics have hit a point where resolution isn’t going to give anything of substance… It’s now more about making lighting work “right” with ray tracing… I think the next thing might be making things as fluid as possible.

          So at least in the gaming space, these higher refresh rates make sense. There’s still fluidity that we as humans can notice that we’re not yet getting. E.g. if you shake your mouse like crazy, even on a 144Hz display the cursor will jump around to discrete spots; it’s not a fluid motion (I’ve never seen a 180Hz display, but I bet the same applies). (Quick numbers on that below.)
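
          That cursor effect is easy to put numbers on: divide the cursor’s speed by the refresh rate and you get the size of each jump. The flick speed below is a rough assumption, not a measurement:

          ```python
          # How far a fast-moving cursor jumps between consecutive frames
          FLICK_SPEED = 10_000  # px/s; a rough guess at a quick mouse flick (assumption)
          for hz in (60, 144, 240, 360, 480):
              print(f"{hz} Hz: cursor moves {FLICK_SPEED / hz:.0f} px between frames")
          # Even at 480 Hz the cursor still skips ~21 px per frame, so the
          # "discrete spots" effect shrinks but never fully disappears.
          ```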

          • @[email protected] · -1 points · 10 months ago

            You can see it moving a mouse super quickly on a static background, but I never notice it happening in games. There’s probably something noticeable in some online FPS games if you really paid attention and could lock your max framerate at 120fps with a 240Hz monitor, but that would be about it, and I don’t play FPS games competitively. I’m perfectly happy running 60fps at 120Hz myself.

    • @Jimmycakes · -13 points · 10 months ago

      How can you waste a battery? Just charge your phone?

        • @Jimmycakes · 1 point · 10 months ago

          Ok, then how is it a waste of money? I wasn’t using the money for anything else. I’ve only bought 120Hz phones since they came out.

  • @Snoopey · 11 points · 10 months ago

    All I want is a 27/28-inch OLED 4K monitor with good HDR. I don’t care about the refresh rate as long as it’s 60Hz+.

    • @dai · 9 points · 10 months ago

      Minimum for me would be 120Hz; I’ve been using 120Hz since 2012 (12 years… man) and anything less feels like a massive step backwards. My old S10+ and my cheapie laptop feel sluggish in any animation/transition scenario.

    • bitwolf · 4 points · 10 months ago

      I’m sticking with IPS until MicroLED matures enough for me to afford it.

      OLED was never designed to be used as a computer monitor, and I don’t want a monitor that only lasts a couple years.

      Researchers just designed a special two-layer OLED (thicker than current OLED) that doubles the lifespan to 10,000 hours at 50% brightness without degrading.

      I’m totally with you on good HDR, though. When it works, it’s as night-and-day as 60 -> 144Hz felt for me.

        • bitwolf · 2 points · 10 months ago

          It doesn’t only last for two years, but it begins to degrade after one year of illuminating blue, which reduces the color accuracy.

          However, OLEDs are also very bad at color accuracy across their brightness range. Typically at lower brightness their accuracy goes out the window.

          This isn’t as bad on smartphones (smartphones also apply additional mitigations such as subpixel rotation), but desktop computers typically display static images for much longer and do not use these mitigations, afaik.

      • @Snoopey · 1 point · 10 months ago

        Burn-in is a non-issue for regular all-day use. As long as you aren’t displaying a static image at 100% brightness for literally years while actively stopping the screen from running its preventative measures, you’ll be fine.

        • bitwolf · 1 point · 10 months ago

          Can desktop computers do those preventative measures? I haven’t seen any desktop interface for the mitigations Samsung puts on its phones.

          Desktops also display static images 100% of the time, unless you change your usage behavior to use full screen all the time.

  • @Donkter · 11 points · 10 months ago

    This asks “can you even tell?” Like, I only get a new screen maybe once every 10 years, and even then the last one I got was used.

    • @Im_old · 2 points · 10 months ago

      One of my two screens is aiming for 20 years of (intensive!) service. It’s even still in 4:3 format. I will probably replace it in the next couple of years, if the magic smoke doesn’t escape first!

    • @H0neyc0mb · -2 points · 10 months ago

      You don’t sound like a mindless consumer; unfortunately, that isn’t most people.

  • @Jumi · 11 points · 10 months ago

    I splurged on a 4K 144Hz monitor when I worked constant night shifts in covid times, and I don’t think I will ever need something else.

    • @Squizzy · 4 points · 10 months ago

      What is the idea behind 144? It seems too particular a number to be arbitrary. 24, 60, and 120 seem to be based on other techs and related media.

      • @Darthjaffacake · 11 points · 10 months ago

        I found people online saying it’s because it’s 24 frames (the standard film frame rate) higher than 120, meaning it can be used to watch movies using integer scaling (a 1:6 ratio of frame rate rather than 1:5.5 or something strange). Take that with a massive grain of salt, though, because lots of people say there are other reasons.

        • Humanius · 9 points · 10 months ago

          If consuming media with integer scaling is the main concern, then 120Hz would be better than 144Hz, because it can be divided by 5 to make 24Hz (for movies) and divided by 2 or 4 to make 30/60Hz (for TV shows).

          144Hz only cleanly divides into 24Hz by dividing it by 6. In order to get to 60Hz you need to divide by 2.4, which is not an integer.

          And with either refresh rate, 25/50Hz PAL content still isn’t divisible by a nice round integer value. (See the quick check below.)
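
          Spelling that divisibility argument out as a trivial check over the usual content framerates:

          ```python
          # Which common content framerates divide evenly into a given refresh rate?
          content = [24, 25, 30, 50, 60]
          for refresh in (120, 144, 240, 600):
              clean = [fps for fps in content if refresh % fps == 0]
              print(f"{refresh} Hz cleanly shows: {clean}")

          # 120 Hz cleanly shows: [24, 30, 60]
          # 144 Hz cleanly shows: [24]
          # 240 Hz cleanly shows: [24, 30, 60]
          # 600 Hz cleanly shows: [24, 25, 30, 50, 60]
          ```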

          • @Darthjaffacake · 2 points · 10 months ago

            Yeah, as I said, take what I said with a massive grain of salt; some people are saying it’s because of a limit on how much data HDMI could send, so it could be that.

        • @Squizzy · 3 points · 10 months ago

          Oh man, that maths didn’t click with me; of course, it’s just another 24 frames.

          • @Darthjaffacake · 2 points · 10 months ago

            Me neither, to be honest. 24 is kind of a weird number when added up.

      • @[email protected] · 6 points · 10 months ago

        I think it’s because 120Hz + overclock can get to 144, so someone probably started selling factory-overclocked 120Hz screens at 144Hz and then it caught on. Then someone did the same to native 144Hz and we got 165Hz. I’m more curious about why 165 was chosen; it’s not a nice number like 144. Maybe since VRR is widespread now they didn’t need nice numbers.

      • @Jumi · 2 points · 10 months ago

        I honestly have no idea, but so far I’ve never really reached 144fps or 4K, much less both simultaneously.

  • @Copernican · 10 points · 10 months ago

    So when are 144Hz, 1440p, HDR OLEDs going to come down in price?

  • @hark · 10 points · 10 months ago

    360hz no scope

  • DigitalTraveler42 · 10 points · 10 months ago

    Isn’t the point that you’re not supposed to be able to tell?

  • @BetaDoggo_ · 8 points · 10 months ago

    It won’t matter until we hit 600. 600 integer-scales to every common media framerate, so frame timings are always perfect. Really they should be focusing on better and cheaper variable refresh rate, but that’s harder to market.

    • @patatahooligan · 11 points · 10 months ago

      Well, not really, because television broadcast standards do not specify integer framerates. E.g. North America uses ~59.94fps. It would take insanely high refresh rates to be able to play all common video formats, including TV broadcasts. Variable refresh rate can fix this, but only for a single fullscreen app. (The arithmetic below shows just how high.)
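
      The NTSC wrinkle in numbers: broadcast “59.94” is exactly 60000/1001 fps, and since 1001 shares no factors with 60000, the lowest whole-number refresh rate that integer-scales it is 60000Hz. A quick check with exact fractions:

      ```python
      from fractions import Fraction

      ntsc = Fraction(60000, 1001)          # the exact "59.94" NTSC frame rate
      print(float(ntsc))                    # 59.94005994...
      print(600 % ntsc == 0)                # False: even 600 Hz can't integer-scale it
      print(Fraction(60000) % ntsc == 0)    # True: the first whole-number rate that works
      ```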

    • @[email protected] · 6 points · 10 months ago

      I mean, the 240 I use already does that. So would 360 or 480. No clue why you fixate on 600.

      • @BetaDoggo_ · 3 points · 10 months ago

        600 also scales to PAL standards.