• andrew@lemmy.stuart.fun · 1 year ago

    If your TV vendor decides to only put 100 Mb cards in their TV, then unfortunately spikey boy wins and you lose, unless you’re willing to down-rez your AV catalog.

    • Koffiato@lemmy.ml · 1 year ago

      The Venn diagram of people who understand this specific technicality and people who don’t want to deal with the shitty TV software is almost a circle, though.

      I’d rather get an Android box at the very least, or just an HTPC.

      • andrew@lemmy.stuart.fun · 1 year ago

        I’m in that Venn diagram, but I’m married with kids, and the UX of anything but the TV remote and the Plex software is a bit much for me to convince the family to learn. And potentially relearn when I find the next great app like Jellyfin 😅

        I think there’s another circle with at least significant overlap between those two of family techies who just can’t convince the rest of the family to care.

        • CancerMancer@sh.itjust.works · 1 year ago

          My wife and kids found Jellyfin easier to use because it more closely resembles Netflix. Your mileage may vary but I get it, and it’s why I even use a media server over just plugging in a laptop with Kodi.

          Sometimes the best solution is whatever you can get the users to actually use.

        • MystikIncarnate@lemmy.ca · 1 year ago

          That’s one solution… and it’s fine, unless someone wants to use the computer while you’re watching something. For any shared-access TV/computer setup, this falls apart quickly.

          I want my SO to be able to watch something on the TV while I’m playing a game, though (and vice versa). Personally, all of my stuff is independent: we each have a gaming computer, and the TV runs separately from all of it. We have a Samsung smart TV with a Chromecast attached, so we have options there… but not everyone is set up like me.

          • winterayars@sh.itjust.works · 1 year ago

            Nobody’s using this computer except me and nobody uses it for media except during group nights so it’s no problem. Technically it has a PlayStation hooked up to it that could be used for DVDs/Blu-rays but that never happens.

        • MystikIncarnate@lemmy.ca · 1 year ago

          From a signaling perspective, they’re very very similar. Given that all TVs have HDMI, it may be the only option.

          • uis@lemmy.world · 1 year ago

            DVI? Yes, basically HDMI is DVI guarded by patent trolls. DisplayPort? No, it is packet-based.

            • MystikIncarnate@lemmy.ca · 1 year ago

              The only real benefit to HDMI over DVI is that it carries audio where DVI does not, which is why it’s used on so many TVs. I know DP can do audio too, so I’m not even going to touch on that. DVI, however, can do dual-link, which IMO makes it a much better video format regardless of any patent nonsense.

    • DavidGA@lemmy.world · 1 year ago

      What the hell are you watching that has a bitrate of >100 Mbps? Because unless you have a 16K television, I suspect the answer is nothing.

      • funktion@lemm.ee · 1 year ago

        I have a 4K Blu-ray remux of Misery that has a 104 Mbps bitrate. But there are only a couple of movies in my collection that break 100. Most of my remuxes are around 50 to 70.

        Anyhoo, it’s all moot in terms of network speed since I just use an HTPC to play all of them.

      • andrew@lemmy.stuart.fun · 1 year ago

        I have plenty with higher-bitrate audio that can hit 80. And with the overhead of the rest of the connections, and possibly just some chipset limits on TCP overhead etc., it starts stuttering around that 80 Mbps mark.
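(A rough back-of-envelope sketch of why a nominal 100 Mbit/s port can’t actually deliver 100 Mbit/s of stream data: Ethernet framing plus TCP/IP headers eat several percent before any chipset or software limits kick in. The figures below assume a standard 1500-byte MTU and plain TCP/IPv4 with no options.)

```python
# Best-case goodput of 100 Mbit/s Ethernet carrying a TCP/IPv4 stream,
# assuming a 1500-byte MTU (the TV's real ceiling may be lower still).
LINK_MBPS = 100
MTU = 1500                       # IP packet size in bytes
ETH_OVERHEAD = 14 + 4 + 8 + 12   # Ethernet header + FCS + preamble + inter-frame gap
TCP_IP_HEADERS = 20 + 20         # IPv4 + TCP headers, no options

payload = MTU - TCP_IP_HEADERS   # 1460 bytes of actual stream data per packet
wire_bytes = MTU + ETH_OVERHEAD  # 1538 bytes occupied on the wire
goodput = LINK_MBPS * payload / wire_bytes
print(round(goodput, 1))         # ~94.9 Mbit/s, before any real-world losses
```

So even a perfectly behaving 100 Mbit link tops out near 95 Mbit/s of usable data; an 80 Mbit/s stream leaves very little headroom for bitrate spikes.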

        • WaterWaiver@aussie.zone · 1 year ago

          80MBit/s audio? How?

          For reference: 2x channels of 16-bit 48KHz raw uncompressed PCM audio (ie “perfect except maybe the noise floor under very very specific circumstances”) is about 1.5MBit/s. Even if you go 96KHz 6 channels (5.1 setup) 24bit uncompressed PCM then it’s only 14MBit + overheads.
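(Those reference figures check out; uncompressed PCM bitrate is simply channels × bit depth × sample rate, sketched here:)

```python
def pcm_bitrate_mbps(channels: int, bit_depth: int, sample_rate_hz: int) -> float:
    """Raw uncompressed PCM bitrate in megabits per second."""
    return channels * bit_depth * sample_rate_hz / 1e6

# Stereo, 16-bit, 48 kHz
print(pcm_bitrate_mbps(2, 16, 48_000))   # 1.536 Mbit/s
# 5.1 channels, 24-bit, 96 kHz
print(pcm_bitrate_mbps(6, 24, 96_000))   # 13.824 Mbit/s
```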

          • andrew@lemmy.stuart.fun · 1 year ago

            The audio isn’t 80 Mbps; the entire file is. The audio is TrueHD 7.1, though. I probably don’t need it, but I haven’t bothered transcoding it yet because I’m not exactly out of space or bandwidth.

        • Theoriginalthon@lemmy.world · 1 year ago

          My Canon ink-tank printer from the mid-COVID era is the same. I didn’t realise it was only 10/100 on the wired port until I was looking at the switch one day and wondered why I had a yellow light instead of green. I was about to run a new network cable until I checked the printer.

          • MystikIncarnate@lemmy.ca · 1 year ago

            Printers really don’t need even 100 Mbps, though. They’re just not fast enough to spit out the prints you’re sending, even at those speeds. So I get it.

            • Theoriginalthon@lemmy.world · 1 year ago

              I get it too, but it was a bit of a shock given that the selling point for everything is bigger, better, faster, stronger; otherwise, why would people upgrade? It’s like finding something with a micro-USB port on it instead of Type-C.

      • Michal · 1 year ago

        Could it be something wrong with a cable? A damaged cable can downgrade your connection from gigabit to 100 Mb.

        • MystikIncarnate@lemmy.ca · 1 year ago

          Or to 10 Mbps, half duplex. I’ve witnessed this. My former company was trying to sell a client a new server because it was “too slow” when I noticed it was only operating at 10/half, instead of the 1000/full that both it and the switch were capable of. Some testing later, the problem turned out to be the cable termination at the server side; a quick re-termination and they were up to gigabit. I grabbed a spare run to the switch, verified it was good, and connected another cable, and the company went from 10M/half to a 2000/full LAG in a matter of about an hour.

          The speed complaints stopped.

    • Psythik@lemm.ee · 1 year ago

      Is that why my shit keeps buffering any time I try to stream a movie larger than 50-60 GB, despite the fact that I have a gigabit connection and a 2.5Gb router? TIL. BRB, running some speed tests on my TV…
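(For a rough sense of whether the port is the bottleneck: a file’s average bitrate is just its size divided by its runtime. The two-hour runtime below is an assumption for illustration. A 60 GB two-hour film averages only about 67 Mbit/s, but remux bitrate peaks can spike well past the average, which is where buffering on a 100 Mbit/s port comes from.)

```python
def avg_bitrate_mbps(size_gb: float, runtime_minutes: float) -> float:
    """Average bitrate in Mbit/s, taking 1 GB = 10**9 bytes."""
    return size_gb * 8_000 / (runtime_minutes * 60)

# Hypothetical 60 GB movie with a 2-hour runtime
print(round(avg_bitrate_mbps(60, 120), 1))  # 66.7 Mbit/s average
```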

    • SwagGaribaldi@lemmy.world · 1 year ago

      I don’t understand how it’s acceptable for $2,000 TVs to have only 100 Mbps ports. Wouldn’t it cost only a few cents per unit to upgrade?