• BmeBenji@lemm.ee
    link
    fedilink
    arrow-up
    44
    ·
    9 months ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • Zink@programming.dev
      link
      fedilink
      arrow-up
      29
      ·
      9 months ago

      Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

      Honestly most people sit far enough from the TV that 1080p is already good enough.

      • frezik@midwest.social
        link
        fedilink
        arrow-up
        6
        ·
        9 months ago

        I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely and still not see jagged edges. 1440p isn’t quite enough to get there.

        Also, there’s some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.

        • Zink@programming.dev
          link
          fedilink
          arrow-up
          3
          ·
          9 months ago

          Oh yeah, I have read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I have always been a sucker for clean crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES) so I haven’t jumped into the latest on CRT shaders myself.

        • Holzkohlen@feddit.de
          link
          fedilink
          arrow-up
          1
          ·
          9 months ago

          But anti-aliasing costs far less performance than rendering at 4K. And you need to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life IMHO

      • minibyte@sh.itjust.works
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        9 months ago

        I’m set up to THX spec, 10 feet from an 85-inch. I’m right in the middle of 1440p and 4K being optimal, but with my eyes I see little difference between the two.

        I’d settle for 4k @ 120 FPS locked.

        • Zink@programming.dev
          link
          fedilink
          arrow-up
          1
          ·
          9 months ago

          I’m 6-8 feet from a 65, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing distance calculators in the past, but not recent enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

          With games that render at a native 4K at 60fps and an uncompressed signal, it is absolutely stunning. If I try to sit like 4 feet from the screen to get more immersion, then it starts to look more like a computer monitor rather than a razor sharp HDR picture just painted on the oled.

          There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” is a similar format to now, I don’t think 8K will benefit people. It will be interesting to see if all nice TVs just become 8K one day like with 4K now though.

    • flintheart_glomgold@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      9 months ago

      For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.

      TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

    • bruhduh@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      edit-2
      9 months ago

      Divide the resolution by 3, though; current-gen upscaling tech can make up that much: 4K = upscaled 720p and 8K = upscaled 1440p
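
      The divide-by-three arithmetic works out like this (a quick sketch; the 3x-per-axis factor is what the comment claims current upscalers can deliver, not a guaranteed spec):

```python
# Internal render height for a given output height, assuming the
# 3x-per-axis upscaling factor the comment describes.
def render_height(output_height: int, scale: int = 3) -> int:
    return output_height // scale

print(render_height(2160))  # 4K (2160p) output -> 720p internal render
print(render_height(4320))  # 8K (4320p) output -> 1440p internal render
```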

      • AngryMob@lemmy.one
        link
        fedilink
        arrow-up
        1
        ·
        9 months ago

        Can doesn’t mean should.

        720p to 4K using DLSS is okay, but you start to see visual tradeoffs, strictly for the extra performance.

        To me it really shines at 1080p to 4K, where it is basically indistinguishable from native for a still-large performance increase.

        Or even 1440p to 4K, where it actually looks better than native with just a moderate performance increase.

        For 8K the same setup holds true: go for better-than-native or match native visuals. There is no real need to go below native just to get more performance; at that point the hardware is mismatched.
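
        The ladder above maps to per-axis render-scale factors. A rough sketch (the mode names and factors are the commonly cited DLSS values, approximate rather than guaranteed):

```python
# Commonly cited per-axis scale factors for DLSS modes (approximate).
DLSS_MODES = {
    "Quality": 2 / 3,            # e.g. 1440p -> 4K
    "Balanced": 0.58,
    "Performance": 1 / 2,        # e.g. 1080p -> 4K
    "Ultra Performance": 1 / 3,  # e.g. 720p -> 4K
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution rendered before upscaling to (out_w, out_h)."""
    s = DLSS_MODES[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(3840, 2160, "Performance"))        # (1920, 1080)
print(render_resolution(3840, 2160, "Ultra Performance"))  # (1280, 720)
```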

    • Final Remix@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      9 months ago

      *monkey’s paw curls*

      Granted! Everything’s just internal render 25% scale and massive amounts of TAA.

  • LaunchesKayaks@lemmy.world
    link
    fedilink
    arrow-up
    32
    ·
    9 months ago

    Has anyone else here never actually bought a TV? I’ve been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I’m going to be really sad when they kick the bucket.

    • ikidd@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      ·
      9 months ago

      Any TV is a dumb TV if you plug a Kodi box in the HDMI and never use the smart trash.

    • woodenskewer@lemmy.world
      link
      fedilink
      arrow-up
      6
      ·
      edit-2
      9 months ago

      I was given a free, very decent, dumb TV and upgraded it to a smart TV with a $5 Steam Link, plus a Cat 6 cable run to it from my router. Best $5 ever. Have no intention of buying a new one. If I ever do, I will try my hardest to make sure it’s a dumb one. I know they sell “commercial displays” that are basically a TV with no third-party apps or a way to install them.

    • Leg@lemmy.world
      link
      fedilink
      arrow-up
      5
      ·
      9 months ago

      Yes, people like me buy TVs. I’m the guy who keeps giving away perfectly good TVs to other people because I’ve bought a new one and don’t want to store the old one. I’ve given away 2 smart TVs so far, though I’m not sure what I’ll do with my current one when I inevitably upgrade.

    • lengau@midwest.social
      link
      fedilink
      arrow-up
      4
      ·
      edit-2
      9 months ago

      I’ve bought my TVs because all my relatives are the same as us. My mom finally tossed an old CRT TV a couple of years ago because it started having issues displaying colours correctly.

    • starman2112@sh.itjust.works
      link
      fedilink
      arrow-up
      3
      ·
      9 months ago

      I used my family’s first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it’s pretty nice having so much screen, but I’m never getting rid of the ol’ Sanyo.

    • Flying Squid@lemmy.world
      link
      fedilink
      arrow-up
      2
      arrow-down
      1
      ·
      9 months ago

      One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the ‘smart’ TV era though.

  • Flying Squid@lemmy.world
    link
    fedilink
    arrow-up
    27
    ·
    9 months ago

    One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a ‘smart’ TV and neither connects to the internet.

    I will use them until they can no longer be used.

    • AngryCommieKender@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      9 months ago

      The last TV I owned was an old CRT that was built in the 70s. I repaired it, and connected the NES and eventually the SNES to it. Haven’t had a need for a TV ever since I went to university, joined IT, and gained a steady supply of second hand monitors.

  • CrowAirbrush@lemmy.world
    link
    fedilink
    arrow-up
    26
    ·
    9 months ago

    We are at a point where 4k rtx is barely viable if you have a money tree.

    Why the fuck would you wanna move to 8k?

    I’m contemplating getting 1440p for my setup, as it seems a decent obtainable option.

      • CrowAirbrush@lemmy.world
        link
        fedilink
        arrow-up
        2
        ·
        9 months ago

        And getting the newest GPU every year because they lock you out of the most recent DLSS update when you don’t upgrade to the newest lineup, right?

          • CrowAirbrush@lemmy.world
            link
            fedilink
            arrow-up
            2
            ·
            9 months ago

            Make it piratable then, i ain’t getting no subscription on hardware functioning.

            Fuck that shit to the high heavens and back.

            • T00l_shed@lemmy.world
              link
              fedilink
              arrow-up
              3
              ·
              9 months ago

              Oh I’m sure some folks will figure out how to pirate it for sure. But as long as big businesses pay the sub fee, Nvidia won’t give a shit about us.

    • michael_palmer@lemmy.sdf.org
      link
      fedilink
      arrow-up
      2
      arrow-down
      1
      ·
      9 months ago

      You don’t have to play only 2023–2024 games. I play GTA V at ultra settings and get 4K@60 FPS. My GPU is a $150 1080 Ti.

      • And009@lemmynsfw.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        9 months ago

        If we’re comparing the latest tech then I’d like to be playing the most recent gen games. GTA V feels as old as San Andreas, in a few years my phone should be running it fine.

  • HEXN3T@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    23
    ·
    9 months ago

    I have a 4K 120Hz OLED TV. The difference is quite drastic compared to my old 1080p LED. It’s certainly sharper, and probably the practical limit. I’ve also seen 8K, and, meh. I don’t even care if it’s noticeable; it’s just too expensive to be worthwhile. We should just push more frames and lower latency for now, or, the Gods forbid, optimise games properly.

    • Blackmist@feddit.uk
      link
      fedilink
      English
      arrow-up
      21
      ·
      9 months ago

      I feel like resolution wasn’t much of an issue even at 1080p. It was plenty. Especially at normal viewing distances.

      The real advantages are things like HDR and higher framerates including VRR. I can actually see those.

      I feel like we’re going to have brighter HDR introduced at some point, and we’ll be forced to upgrade to 8K in order to see it.

      • HandBreadedTools@lemmy.world
        link
        fedilink
        English
        arrow-up
        13
        ·
        9 months ago

        Ehhhh, I think 1080p is definitely serviceable, it’s even good enough for most things. However, I think 1440p and 4k are both a pretty noticeable improvement for stuff like gaming. I can’t go back to 1080p after using my 3440x1440 monitor.

        • Blackmist@feddit.uk
          link
          fedilink
          English
          arrow-up
          4
          ·
          9 months ago

          I notice more on my PC. Being up close I can see individual pixels. And for productivity software, the higher resolution wins every time.

          On a 55" TV, sitting 3 metres away, no real difference for me. I’d rather have extra frames than extra pixels.

          And that’s for gaming. With good quality video, I can’t see any difference at all.

      • Honytawk@lemmy.zip
        link
        fedilink
        arrow-up
        3
        ·
        9 months ago

        Depends entirely on the size of the screen.

        A normal monitor is fine on 1080p

        But once you go over 40", a 4K is really nice

    • Neil@lemmy.ml
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      1
      ·
      9 months ago

      I’ve heard recently that there’s “cheap OLED” and “expensive OLED.” Which one did you go for? I’ve got a 75" 4k OLED for $400 and it’s definitely super dark. I can’t even watch some movies during the day if they’re too dark. The expensive ones are supposed to be a lot better.

      • Venat0r@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        9 months ago

        I’ve got an older Sony bravia A9G and I’ve seen reviews complaining that it’s too dim but I’ve had no issues. I think some people just have really poorly thought out tv placement, or overly bright rooms. Also just close the curtains if the movie is dark…

        If you want to watch tv outside in direct sunlight you’ll need to follow this guide to build a custom super bright tv: https://youtu.be/WlFVPnGEb8o

    • johannesvanderwhales@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      9 months ago

      Too expensive both in terms of price and the massive amount of storage needed for 8K video. I don’t really think 8K is ever going to be the dominant format. There’s not much point in increasing resolution for minuscule gains that are almost certainly not noticeable on anything but a massive display. Streaming services are going to balk at 8K content.

  • LoudWaterHombre@lemmy.dbzer0.com
    link
    fedilink
    arrow-up
    16
    arrow-down
    1
    ·
    9 months ago

    My takeaway from this comment section is that smart TVs are straight from hell and should be treated as such. It is very important that you get a TV from BEFORE smart TVs were a thing.

    • johannesvanderwhales@lemmy.world
      link
      fedilink
      arrow-up
      8
      ·
      9 months ago

      Display technology has advanced quite a bit since smart tvs have become ubiquitous, though. So you are sacrificing quality to avoid those headaches.

      Personally I just don’t give my smart TV an internet connection.

      • LoudWaterHombre@lemmy.dbzer0.com
        link
        fedilink
        arrow-up
        2
        ·
        9 months ago

        That’s what I did too. It has no connection and I don’t use any of the smart TV features. Instead I use a box of my own. I never felt this stupid.

    • Honytawk@lemmy.zip
      link
      fedilink
      arrow-up
      6
      ·
      9 months ago

      Nah, you can buy new TVs

      Just make sure they can be used without network and then never connect them to the internet.

      Bought a new TCL recently, none of the smart features work, but got excellent screen quality with all the new specs.

  • ivanafterall@kbin.social
    link
    fedilink
    arrow-up
    12
    ·
    9 months ago

    A few years ago, I got a good deal on a 4K projector and set up a 135" screen on the wall. The lamp stopped working and I’ve put off replacing it. You know what didn’t stop working? The 10+ year old Haier 1080p TV with a ding in the screen and the two cinder blocks that currently keep it from sliding across the living room floor.

    • Domi@lemmy.secnd.me
      link
      fedilink
      arrow-up
      1
      ·
      9 months ago

      The lamp stopped working and I’ve put off replacing it.

      If you still have it, do it. Replacing the lamp on a projector is incredibly easy and only takes like 15 minutes.

      If you only order the bulb without casing it’s also very cheap.

  • clearleaf@lemmy.world
    link
    fedilink
    arrow-up
    11
    arrow-down
    1
    ·
    9 months ago

    The performance difference between 1080p and 720p on my computer makes me really question if 4k is worth it. My computer isn’t very good because it has an APU and it’s actually shocking what will run on it at low res. If I had a GPU that could run 4k I’d just use 1080p and have 120fps all the time.

    • Chestnut@lemmy.world
      link
      fedilink
      arrow-up
      11
      ·
      9 months ago

      Tldr: Higher resolutions afford greater screen sizes and closer viewing distances

      There’s a treadmill effect when it comes to higher resolutions

      You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience

      The reason to upgrade to a higher resolution is because you want bigger screens

      If you want a TV for a monitor, for instance, you’ll want 4K because you’re close enough that you’ll be able to SEE the pixels otherwise.

      • Johanno@feddit.de
        link
        fedilink
        arrow-up
        3
        arrow-down
        1
        ·
        9 months ago

        As long as you don’t know that there is anything better, you will love 1080p. Once you have seen 2K you don’t want to switch back. Especially on bigger screens.

        On the TV I still like 1080p. I remember the old CRT TVs with just bad resolution. In comparison, 1080p is a dream.

        However, if the video is that high in quality, you will like 4K on a big TV even more. But if the movie is only 720p (like most DVDs or streaming services), then 4K is worse than 1080p; you need some upscaling in order to get a clear image.

      • Flying Squid@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        9 months ago

        You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience

        This is sort of how I feel about 3D movies and why I never go to them. After about 20 minutes, I mostly stop noticing the 3D.

    • pishadoot@sh.itjust.works
      link
      fedilink
      arrow-up
      8
      arrow-down
      1
      ·
      9 months ago

      1440p is the sweet spot. Very affordable these days to hit high FPS at 1440 including the monitors you need to drive it.

      1080@120 is definitely low budget tier at this point.

      Check out the PC Builder YouTube channel. Guy is great at talking gaming PC builds, prices, performance.

    • jj4211@lemmy.world
      link
      fedilink
      arrow-up
      7
      ·
      9 months ago

      Whatever the resolution of ‘real life’, what matters is at what point our little eyes and brains no longer can perceive a difference.

      In average scenery, the general consensus is about 60 pixels per degree of vision. For something more synthetic, like a white dot in empty space, that sort of small, high-contrast detail might take around 200 pixels per degree to make the white dot equally visible on the display as when seen directly. A 75” display 2 meters out at 4K is about 85 pixels per degree. That is comfortable enough for a display.
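
      That 85-pixels-per-degree figure is easy to check. A minimal sketch of the geometry, assuming a 16:9 panel and the distance measured to the screen centre:

```python
import math

def pixels_per_degree(diagonal_in: float, distance_m: float,
                      horiz_px: int, aspect=(16, 9)) -> float:
    """Approximate pixels per degree of vision at the screen centre."""
    aw, ah = aspect
    # Physical screen width from the diagonal and aspect ratio.
    width_m = diagonal_in * 0.0254 * aw / math.hypot(aw, ah)
    # Horizontal field of view the screen subtends at this distance.
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return horiz_px / fov_deg

print(round(pixels_per_degree(75, 2.0, 3840)))  # ~85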

      Similar story with ‘frames per second’. Move something back and forth really fast and you’ll see a blurry smear of the object rather than observing its discrete movement. So if you accurately match the blurring you would naturally see, and use a low-persistence backlight/display, you’ll probably get away with something like 60 FPS. If you are stuck with discrete frames and unable to blur or blank between meaningful frames, you might have to go a bit further up, to 120 or 144 FPS.

    • MeDuViNoX@sh.itjust.works
      link
      fedilink
      arrow-up
      4
      ·
      9 months ago

      I feel like it’s kinda infinite, because you can zoom in to the quantum level and then looking at things sorta fails you… But I’m no scientist.

    • datendefekt@lemmy.ml
      link
      fedilink
      arrow-up
      2
      ·
      9 months ago

      The question isn’t how high the resolution of reality is, but how well we can process it. There is an upper limit to visual acuity, but I’d have to calculate what an arc-minute at 6 meters comes to and I’m too lazy right now. Regarding fps, some people can notice artefacts up to 800Hz, but I’d think going with 120Hz would be OK. Remember, you’ll have to generate stereoscopic output.
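
      For what it’s worth, the arc-minute figure is a one-liner; at 6 meters, one arc-minute subtends roughly 1.75 mm:

```python
import math

# Size subtended by one arc-minute (a common visual acuity limit) at 6 m.
distance_m = 6.0
arcmin_rad = math.radians(1 / 60)
size_mm = distance_m * math.tan(arcmin_rad) * 1000
print(f"{size_mm:.2f} mm")  # ~1.75 mm
```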

    • Ephera@lemmy.ml
      link
      fedilink
      arrow-up
      1
      ·
      9 months ago

      Both are practically infinite, or well, the question doesn’t really make sense.

      Reality isn’t rasterized, so there’s no resolution. You just have light waves bouncing off of things and into your eyes. They can hit at all kinds of angles and positions, and your brain will interpret different impact frequency distributions as some color or brightness you see in a certain position.

      And you don’t have a shutter in your eyes or something else that would isolate individual frames. Light waves just arrive whenever they do and your brain updates its interpreted image continuously.

      So, in principle, you can increase the resolution and display rate of a screen to infinity and you’d still perceive it differently (even if it’s not noticeable enough to point it out).
      The cost just goes up ever more and the returns diminish, so the question rather has to be, whether it’s worth your money (and whether you want to sink that much money into entertainment in the first place).

    • PriorityMotif@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      9 months ago

      It doesn’t matter what refresh rate it says on the box; it depends on the hardware and firmware inside. I’ve seen good 60Hz TVs and I’ve seen ones with blurry motion that is borderline unwatchable. There are some really cheap TVs out there now, and there’s a reason they’re so cheap.

  • kandoh@reddthat.com
    link
    fedilink
    arrow-up
    7
    ·
    9 months ago

    Televisions are one of the few things that have gotten cheaper and better these last 20 years. Treat yourself and upgrade.

    • BowtiesAreCool@lemmy.world
      link
      fedilink
      arrow-up
      4
      ·
      9 months ago

      Except they turned into trash boxes in the last couple of years. Everything is a smart TV with ad potential and functionality that will eventually be unsupported. I’m holding onto my dumb TVs as long as I can.

      • thatKamGuy@sh.itjust.works
        link
        fedilink
        arrow-up
        3
        ·
        9 months ago

        We’ve got a pair of LG C1 OLEDs in the house, and the best thing we did was remove any network access whatsoever. Everything is now handled through Apple TVs (for AirPlay, Handoff etc.), but literally any decent media device or console would be an upgrade on what manufacturers bundle in.

      • Godnroc@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        9 months ago

        Yup. Those cheap TV’s are being subsidized by advertisements that are built right in. If you don’t need the smart functionality, skip connecting it to the Internet. (If you can. Looking at you Roku TV’s!)

      • voxel@sopuli.xyz
        link
        fedilink
        arrow-up
        2
        ·
        edit-2
        9 months ago

        Well, you can just not connect it to the internet and still have some extra features.
        Also, if it’s an Android TV, it’s probably fine (unless you have one with the new Google TV dashboard);
        these usually don’t come with ads or anything except regular Google Android tracking, and you can just unpin Google Play Movies or whatnot.

    • pedz@lemmy.ca
      link
      fedilink
      arrow-up
      2
      ·
      9 months ago

      But be careful of the “smart” ones. If you have a “dumb” one that is working fine, keep it. I changed mine last year and I don’t like the new “smart” one. IDGAF about Netflix and Amazon Prime buttons or apps. And now I’m stuck with a TV that boots. All I want is to use the HDMI input, but the TV has to be “on” at all times because it runs Android. So if I unplug the TV, it has to boot an entire operating system before it can show the HDMI input.

      I don’t use any “smart” feature and I would very much have preferred to buy a “dumb” TV but “smart” ones are actually cheaper now.

      Same for my parents. They use OTA with an antenna and their new smart TV has to boot into the tuner mode instead of just… showing TV. Being boomers they are confused as to why a TV boots into a menu where they have to select TV again to use it.

      New TVs may be cheap, but it’s because of the “smart” “spying” function, and they are so annoying. I really don’t like them.

      • starman2112@sh.itjust.works
        link
        fedilink
        arrow-up
        2
        ·
        9 months ago

        Can’t speak for your TV, but mine takes all of 16 seconds to boot up into the HDMI input from the moment I plug it in, and there’s a setting to change the default input when it powers on. I use two HDMI ports so I have it default to the last input, but I have the option to tell it to default to the home screen, a particular HDMI port, the AV ports, or antenna

        Not a fan of the remote though. I don’t have any of these streaming services, and more importantly I’ll be dead and gone before I let this screen connect to the Internet

      • WarmSoda@lemm.ee
        link
        fedilink
        arrow-up
        1
        ·
        9 months ago

        Yeah the bootup kills me. I got lucky that my current tv doesn’t do it. But man the last one I had took forever to turn on. It’s stupid.

    • Flying Squid@lemmy.world
      link
      fedilink
      arrow-up
      1
      arrow-down
      1
      ·
      9 months ago

      I’ve been in stores which have demonstration 8K TVs.

      Very impressive.

      I’m still fine with my 720p and 1080p TVs. I’ve never once felt like I’ve missed out on something I was watching which I wouldn’t have if the resolution was higher and that’s really all I care about.

      • jj4211@lemmy.world
        link
        fedilink
        arrow-up
        3
        ·
        9 months ago

        I think what’s impressive likely has more to do with other facets than the resolution. Without putting your face up to the glass, you won’t be able to discern a difference; human visual acuity just doesn’t get that high at a normal distance of a couple of meters or more.

        I’d rather have a 1080p plasma than most cheap 4K LCDs. The demonstrators are likely OLED, which means supremely granular control of both color and brightness, like plasma used to offer. Even nice LCDs have granular backlighting, sometimes with something like a 1920x1080 array of backlights, getting close to OLED in terms of brightness control.

      • kandoh@reddthat.com
        link
        fedilink
        arrow-up
        2
        ·
        9 months ago

        I have a 4k tv with backlighting that matches the screen. When I take magic mushrooms and watch it I can see god

  • Cowbee [he/him]@lemmy.ml
    link
    fedilink
    arrow-up
    6
    ·
    9 months ago

    4k is the reasonable limit, combined with 120 FPS or so. Beyond that, the returns are extremely diminished and aren’t worth truly considering.

      • Cowbee [he/him]@lemmy.ml
        link
        fedilink
        arrow-up
        4
        ·
        9 months ago

        There are legitimately diminishing returns; realistically I would say 1080p would be fine as the max, but 4K really is the sweet spot. Eventually, there is a physical limit.

        • starman2112@sh.itjust.works
          link
          fedilink
          arrow-up
          4
          ·
          9 months ago

          I fully agree, but I also try to keep aware of when I’m repeating patterns. I thought the same thing about 1080p that I do about 4k, and I want to be aware that I could be wrong again

          • Cowbee [he/him]@lemmy.ml
            link
            fedilink
            arrow-up
            3
            ·
            9 months ago

            Yep, I’m aware of it too, the biggest thing for me is that we know we are much closer to physical limitations now than we ever were before. I believe efficiency is going to be the focus, and perhaps energy consumption will be focused on more than raw performance gains outside of sound computing practices.

            Once we hit that theoretical ceiling on the hardware level, performance will likely be gained at the software level, with more efficient and clean code.

    • N-E-N@lemmy.ca
      link
      fedilink
      arrow-up
      1
      arrow-down
      1
      ·
      9 months ago

      4K I’d agree with, but going from 120 to 240fps is notable

    • BorgDrone@lemmy.one
      link
      fedilink
      arrow-up
      0
      ·
      9 months ago

      It’s a chicken/egg problem. We need 8k so we can use bigger TV’s, but those bigger TV’s need 8k content to be usable.

      • Holzkohlen@feddit.de
        link
        fedilink
        arrow-up
        3
        ·
        edit-2
        9 months ago

        What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV. Only suckers buy into 8k. Same people who bought those rounded screen smartphones thinking it will be the new thing. Where are those phones now?

        • BorgDrone@lemmy.one
          link
          fedilink
          arrow-up
          1
          ·
          edit-2
          9 months ago

          What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV.

          You misunderstand the point of higher resolutions. The point is not to make the image sharper, the point is to make the TV bigger at the same sharpness. This also means the same viewing distance.

          At the end of the CRT era I had a 28” TV; at PAL resolutions that is ~540p visible. At the end of the HD era I had a 50” TV. Note that the ratios between resolution and size are close together. Now we’re partway through the 4K era and I currently have a 77” 4K TV. By the time we move to the 8K era I expect to have something around 100”. 8K would allow me to go up to a 200” TV.
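
          Those size-to-resolution ratios can be checked directly (a quick sketch using the diagonals quoted above; the 200” entry is the projected 8K ceiling, not an owned set):

```python
# Diagonal inches per vertical line of resolution for each generation.
generations = [
    ("PAL CRT", 28, 540),
    ("HD", 50, 1080),
    ("4K", 77, 2160),
    ("8K", 200, 4320),
]
for name, diag_in, lines in generations:
    print(f"{name}: {diag_in / lines:.4f} in per line")
```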

          I sit just as far from my 77” TV as I sat from my 28”, my 50” or my 65”. The point of a larger TV is to have a larger field of view, to fill a larger part of your vision. The larger the FoV, the better the immersion. That’s why movie theaters have such large screens, and why IMAX theaters have screens that curve around you.

          Don’t think of an 8K TV as sharper; think of 4K as a cropped version of 8K. You don’t want to see the same things sharper, you want to see more things. Just like when we went from square to widescreen TVs: the wider aspect ratio got us extra content on the sides, and the 4:3 version just cut off the sides of the picture. So when you go from a 50” 4K to a 100” 8K, you can see this as getting a huge additional border around the screen that would simply be cut off on a 4K screen.

          Of course, content makers need to adjust their content to take into account this larger field-of-view. But that’s again a chicken/egg problem.

          The endgame is to have a TV that fills your entire field-of-view, so that when you are watching a movie that is all you see. As long as you can see the walls from the corners of your eye, your TV is not big enough.