• vane@lemmy.world · 4 hours ago

    If it’s not USB-C, it’s banned in the EU. Because we stopped there and we won’t go forward.

    • MDCCCLV@lemmy.ca · 1 minute ago

      I think you could have a second connector in addition to a main USB-C.

      Honestly we need higher-capacity screen cables for PCs. Both HDMI and DisplayPort are limiting performance because of their low bandwidth (40-80 Gbps); they max out around 4K 120 Hz with uncompressed HDR color. You can’t drive 8K screens or multiple 4K screens without lowering quality.
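For a rough sense of where those limits come from, here is a back-of-the-envelope bandwidth calculation (Python; it assumes 10-bit-per-channel RGB, i.e. 30 bits per pixel, and ignores blanking and encoding overhead, so real links need somewhat more):

```python
# Rough uncompressed video bandwidth: width * height * refresh * bits-per-pixel.
# Assumes 10-bit-per-channel RGB (30 bpp); ignores blanking/encoding overhead.

def video_gbps(width, height, hz, bpp=30):
    return width * height * hz * bpp / 1e9

print(f"4K120: {video_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9 Gbps
print(f"8K60:  {video_gbps(7680, 4320, 60):.1f} Gbps")   # ~59.7 Gbps
print(f"8K120: {video_gbps(7680, 4320, 120):.1f} Gbps")  # ~119.4 Gbps
```

Against a 40-80 Gbps link, 4K 120 Hz fits, but 8K at high refresh rates clearly does not without compression.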

    • null_dot@lemmy.dbzer0.com · 4 hours ago

      the GPMI cable comes in two flavors — a Type-B that seems to have a proprietary connector and a Type-C that is compatible with the USB-C standard

      I actually copied this from the article to come here to the comments and have a whinge about all the different USB-C standards, and here you are explaining the reason why.

    • NeonKnight52@lemmy.ca · 4 hours ago

      Actually? I don’t know much about that legislation. Does it really not have room built-in for tech improvements?

  • Justin@lemmy.jlh.name · 10 hours ago (edited)

    Imagine putting out a new high bandwidth cable standard in 2025 based on copper.

    The sooner display and networking move to SFP, the better.

        • gmtom@lemmy.world · 4 hours ago

          So if you have a beefy PSU you should be able to power your monitor off the DP?

          Or does carrying power limit data throughput?

          • CandleTiger@programming.dev · 15 minutes ago

            The way it works for power over Ethernet (and I assume USB power delivery must work the same way) is that it does not reduce bandwidth, because they run the power and the signal over the same wires at the same time.

            There is a power injector at one end and a filter at the other end that separate the high-frequency signal and the DC (zero-frequency) power onto different wires.

            This is essentially the same thing they’re already doing with multi-frequency stacking on those same wires (and on fiber) to get the crazy bandwidth in the first place. DC power is just one more very low (effectively zero) frequency running on the same stack.
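A toy sketch of that separation idea (illustrative only, not how a real PoE injector or bias tee is implemented): superimpose a DC level on a high-frequency signal, then recover the DC side as the average and the signal side as whatever is left over:

```python
import math

# Toy illustration of power + data sharing one conductor: a DC "power"
# level superimposed on a high-frequency "data" signal.
N = 1000                                   # samples covering 10 full signal periods
dc_power = 48.0                            # DC volts, stand-in for injected power
signal = [0.5 * math.sin(2 * math.pi * 10 * k / N) for k in range(N)]
wire = [dc_power + s for s in signal]      # both travel on the same wire

# The far-end "filter": the DC component is the average of the waveform,
# and subtracting it out leaves the high-frequency signal untouched.
recovered_dc = sum(wire) / N
recovered_signal = [v - recovered_dc for v in wire]

print(round(recovered_dc, 6))           # 48.0
print(round(max(recovered_signal), 6))  # 0.5
```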

                • kayzeekayzee@lemmy.blahaj.zone · 33 minutes ago

                  Based on this pin configuration, there are only two dedicated power pins, which isn’t great for large wattages. The rest are twinax signal pairs separated by ground to reduce crosstalk.

                  Usually when connectors are designed for power delivery, they use bigger contacts to reduce the contact resistance (signal contacts tend to be small so you can fit more of them in the same space). I’m guessing the original DP connector form factor wasn’t made with such high power in mind, so it would make a lot of sense to use the spare signal pins for power delivery in this case. Running too much power through too few small pins can damage the contacts, either by instant-welding the contact surfaces or by overheating the connector (see NVIDIA GPUs). High voltages can also cause arcing, which even in the best case will seriously degrade any connector.
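To make the pin-count point concrete, a quick I²R estimate (the 0.01 Ω contact resistance and the pin counts are illustrative guesses, not GPMI or DisplayPort figures):

```python
# Back-of-the-envelope per-contact heating: P = I^2 * R per pin.
# Contact resistance and pin counts are illustrative guesses only.

def per_pin_heat_w(total_watts, volts, n_pins, contact_res_ohms=0.01):
    amps_per_pin = (total_watts / volts) / n_pins
    return amps_per_pin ** 2 * contact_res_ohms

# 480 W at 48 V is 10 A total. Two power pins vs. spreading across eight:
print(per_pin_heat_w(480, 48, 2))   # 0.25 W dissipated per contact
print(per_pin_heat_w(480, 48, 8))   # ~0.016 W per contact
```

Per-contact heating falls with the square of the per-pin current, which is why spreading the same 10 A across more pins helps so much.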

                  Take all of this with a huge grain of salt, because I just learned this stuff like a month ago and my department has nothing to do with any of it. Just thought someone might find it interesting.

  • Funwayguy@lemmy.world · 13 hours ago

    Running that much power next to a data line sounds like a terrible idea for signal integrity, especially if something shorts to said data lines. It just sounds sketchy, or hedged with so many asterisks that it’s functionally impossible to reach the claimed throughput.

    • null_dot@lemmy.dbzer0.com · 4 hours ago

      See, IDK anything about data and power and cables but I dislike the vibe when I dock my laptop with that itty bitty USB-C connector that does power and 2x monitors and networking and peripherals.

      I did buy the bonkers-expensive proper cable from Lenovo, and it does generally just work, but maybe once every few weeks I have to unplug & re-plug.

      More power and more data through the same cable just seems daft.

    • YiddishMcSquidish@lemmy.today · 13 hours ago

      It’s likely DC, which, without alternating magnetic fields, won’t degrade the signal as badly. But I wholeheartedly agree with you on power delivery. What could possibly need/use that much power‽

      • SmoothLiquidation@lemmy.world · 13 hours ago

        The option to run one cable to the monitor, or reversely charge your laptop with one docking cable.

        Maybe you could use this to daisy chain monitors and power them all.

        • IsoKiero@sopuli.xyz · 11 hours ago

          The option to run one cable to the monitor, or reversely charge your laptop with one docking cable.

          USB-C docks can already do this. Obviously with less power, and it’s not perfect by any means, but we don’t need another technology for this. And sure, it’s two cables, one from the wall outlet to the integrated dock/monitor and USB-C from the dock to the laptop, but no matter the technology you still need something to plug into a wall outlet.

      • Justin@lemmy.jlh.name · 10 hours ago

        DisplayPort and HDMI are either twisted-pair or coaxial, I think. Low-frequency RF from 50 Hz AC shouldn’t interfere with them, but high-frequency changes in current on a power wire will.

    • amorpheus@lemmy.world · 10 hours ago

      The USB standard is up to what, 40 Gbps and 240 W? That’s already pushing the envelope. We’ll see if this new standard can prove itself, anyway.

  • drspod@lemmy.ml · 15 hours ago

    This must be for commercial displays where it is beneficial for installation to have power and data over a single cable.

    I can’t think why I would want power delivery to my PC monitor over the display cable. It would just put extra thermal load on the GPU.

    • IrateAnteater@sh.itjust.works · 15 hours ago

      I think it’s aimed at TVs in general, not computer monitors. Many people mount their TVs to the wall, and having a single cable to run hidden in the wall would be awesome.

      • GamingChairModel@lemmy.world · 14 hours ago

        I wonder what the use case is for 480 W though. Gigantic 80" screens generally draw something like 120 W. If you’re going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle would be trivial.

        • Anivia@feddit.org · 12 hours ago

          Gigantic 80" screens generally draw something like 120W

          In HDR mode they can draw a lot more than that during short peaks.

        • IrateAnteater@sh.itjust.works · 12 hours ago

          Headroom and safety factor. Current screens may draw 120 W, but future screens may draw more, and it’s much better to stay well under the max rated power.

      • mosiacmango@lemm.ee · 13 hours ago

        In-wall power cables need to be rated for it to prevent fire risk, so this one will need thick insulation or a fire-resistant jacket.

      • BombOmOm@lemmy.world · 14 hours ago (edited)

        Even in that scenario it will complicate the setup. Now your Roku will also have to power your TV? No, any sane setup will have a separate power cable for the TV.

        • IrateAnteater@sh.itjust.works · 11 hours ago

          I don’t think you’d ever have a peripheral power the tv. The use case I’m envisioning is power and data going to the panel via this single connector from a base box that handles AC conversion, as well as input (from Roku etc) and output (to soundbar etc.). Basically standardizing what some displays are already doing with proprietary connectors.

    • amorpheus@lemmy.world · 11 hours ago

      It would just put extra thermal load on the GPU.

      Passing power through doesn’t have to put noticeable load on the GPU. The main problem I see there is getting even more power to the GPU - Nvidia’s top cards are already at the melting point for their power connector.

      • drspod@lemmy.ml · 10 hours ago

        Passing power through doesn’t have to put noticeable load on the GPU.

        I specifically said thermal load. Power delivery always causes heat dissipation due to I²R losses.

        • amorpheus@lemmy.world · 8 hours ago

          That’s what I meant. Compared to the power the GPU is actually using, transmission losses for a pass-through should be negligible, if you have a good way to get the power to the card in the first place.
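Rough numbers back that up (the 0.05 Ω path resistance is an assumed figure, chosen only to show the order of magnitude):

```python
# I^2 * R loss for passing 480 W through at 48 V, with an assumed
# 0.05 ohm cable/connector path resistance (illustrative, not measured).
watts, volts, resistance = 480.0, 48.0, 0.05
current = watts / volts           # 10 A
loss = current ** 2 * resistance  # heat dissipated along the path, in watts

print(loss)  # 5.0 -- real, but small next to a 300-400 W GPU
```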

    • PeachMan@lemmy.world · 14 hours ago

      The popular use for power delivery through a display cable is charging a laptop from your monitor; it’s already very common with Thunderbolt or USB4 monitors. But 480 W seems a bit overkill for that.

    • jaxxed@lemmy.ml · 10 hours ago (edited)

      ~~Why is that better than usb-c? ~~

      Wait… Power the other way. Whoops, I get it.

    • Sizing2673@lemmy.world · 12 hours ago

      Current cables already kinda allow this, and the actual load is pretty small; even a big 30-inch display is maybe 20 watts.

      Power delivery goes several times that, and laptops are another very useful case for it: it’s nice to have just a single cable for display and power.

      You can do this to an extent today.

  • Pennomi@lemmy.world · 15 hours ago

    Even an 80” TV only uses around 150 W, if my research is correct. Surely this must be meant for massive displays.

    • Avid Amoeba@lemmy.ca · 13 hours ago (edited)

      It depends on the voltage used. If they run the 48 V that USB-C EPR seems to support, 480 W works out to 10 A, double the 5 A the cable is rated to carry today, so the heat in the cable would go up accordingly.

      When it comes to their own new connector/cable they can use even higher voltage or more/thicker conductors for power.
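The voltage/current trade-off is simple Ohm's-law arithmetic (the candidate voltages here are hypothetical, not anything GPMI has specified):

```python
# Current required to deliver 480 W at a few candidate bus voltages (I = P / V).
# Higher voltage means less current, and conductor heating scales with I^2,
# which is why bumping the voltage beats thickening the copper.
for volts in (20, 48, 96):
    amps = 480 / volts
    print(f"{volts:>3} V -> {amps:4.1f} A")
```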

        • Bilb!@lem.monster · 5 hours ago

        I usually associate it more with Intel since they certify Thunderbolt devices on all the non-Apple hardware and that’s all I use. I forgot Apple had anything to do with it.