• 0 Posts
  • 21 Comments
Joined 1 year ago
Cake day: July 11th, 2023




  • To clarify: I have a 100W Ugreen Nexode 4 Port USB Charger that I use to charge my laptop (~60W), Steam Deck (~40W), iPhone (~20W) and AirPods (~5W?).

    The problem is that if my original product cable has temporarily gone walkabout and I need to grab a random one to stand in, there’s no clear way of telling whether I’ve accidentally picked a cheap 5W-max cable to keep my laptop charged while working.

    Obviously there are some context clues, like cable thickness, but with cosmetic braiding becoming so common even that’s getting harder to rely on.


  • Yes, you can. The charger and the device negotiate what they each support and pick the highest power level they both agree on.

    E.g. my laptop charger charges my MacBook at full speed (100W), but my iPhone at only 20W.

    That bit is pretty straightforward and transparent to end users (there are a few rare cases where devices can’t agree on the fastest profile and fall back to a slower one); the bigger issue is cables without sufficient wire gauge, or with missing connections that prevent the charger and device from negotiating their full capabilities.
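    To make that concrete, the negotiation is essentially a “highest mutually acceptable wattage” search. A minimal Python sketch of the idea, using made-up profile numbers rather than a real PD stack:

```python
# Toy model of USB-PD style negotiation: pick the highest-wattage profile that
# the charger offers, the device accepts, and the cable can safely carry.
# All numbers are illustrative; a real PD stack exchanges PDO messages instead.

def negotiate(charger_profiles, device_max_volts, device_max_amps, cable_max_amps):
    best = None
    for volts, amps in charger_profiles:
        if volts > device_max_volts:
            continue  # device won't accept this voltage level
        usable_amps = min(amps, device_max_amps, cable_max_amps)
        watts = volts * usable_amps
        if best is None or watts > best[2]:
            best = (volts, usable_amps, watts)
    return best

charger = [(5, 3), (9, 3), (15, 3), (20, 5)]  # a 100W charger's fixed profiles

print(negotiate(charger, 20, 5, 5))   # MacBook + 5A e-marked cable -> (20, 5, 100)
print(negotiate(charger, 9, 2.3, 3))  # iPhone-ish limits -> roughly 20W
print(negotiate(charger, 20, 5, 3))   # same MacBook, cheap 3A cable -> capped at 60W
```

    The third case is the trap: the charger and laptop both want 100W, but a cable that can only carry 3A silently caps the result at 60W.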


  • It’s been more of a pain in the arse than initially expected.

    Most motherboards (for example) only have 2-4 USB-C ports, meaning that I still need to employ A-C and C-C cables for peripherals etc.

    My main gripe is that the standard just tries to do too many things without clear delineation/markings:

    1. Is it a USB 2.0 (480Mbit), 5Gbit, 10Gbit or 20Gbit cable? Can’t really tell from the plug alone.

    2. More importantly, for charging devices: how the heck do I determine the maximum wattage I can run through it? (Rough rule of thumb sketched at the end of this comment.)

    For all its faults, at least the blue colour of a USB 3.0 plug (or the additional connectors on B/Micro) made it easy to differentiate!

    Now I’m eyeing up a USB Cable tester just to validate and catalogue my growing collection! 🤦🏻‍♂️
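    As for the wattage rule of thumb mentioned above: the ceiling mostly comes down to the cable’s rated current (and, for the newest cables, EPR voltage support). A small sketch of the arithmetic, assuming the standard fixed PD voltage levels:

```python
# Rule-of-thumb power ceilings for USB-C cable classes, assuming the standard
# fixed PD voltages (20V, or 48V for newer EPR cables). Illustrative only;
# a real cable advertises its rating via its e-marker chip.
CABLE_CLASSES = {
    "no e-marker (3A)":       (20, 3),  # 20V x 3A = 60W; every C-to-C cable must manage this
    "5A e-marked":            (20, 5),  # 20V x 5A = 100W
    "EPR e-marked (PD 3.1)":  (48, 5),  # 48V x 5A = 240W
}

for name, (volts, amps) in CABLE_CLASSES.items():
    print(f"{name}: up to {volts * amps}W")
```

    Data speed is a separate axis entirely (a 240W-capable cable can still be USB 2.0-only), which is exactly why a cable tester ends up being so handy.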





  • thatKamGuy@sh.itjust.works to memes@lemmy.world · “We’re sorry...” · 2 months ago

    Going by that argument, though, EVERYTHING is an indirect act of God.

    Bullet wound? Clearly it was God’s will, for ordering the universe in such a way that an individual was armed at that point in time to cause you harm.

    Cancer? God willed the carcinoma onto your skin.

    Maybe it’s just Argumentum ad absurdum, but insurance companies are basically arguing against their own existence.


  • I specified one generation of hardware backwards compatibility; beyond that software emulation would be more than sufficient.

    The PS5 is backwards compatible with all but ~6 PS4 titles. Sure, that’s entirely because of the shared x86-64 architecture, but it makes the PS4 stand out like a sore thumb for its lack of direct generational backwards compatibility.

    By the end of the PS3’s lifecycle the Cell processor had been die-shrunk multiple times, reducing power consumption, heat output and the PCB space required. It could then have shared the rest of the PS4’s existing I/O chips and circuitry.

    There was literally no reason to remove backwards compatibility other than corporate greed. Blindly accepting that, and actually trying to justify it as a good thing, is one of the key reasons this hobby has gone down the toilet.




  • No doubt, the 360 had the PS3’s number early on - due in no small part to the lack of documentation for the Cell architecture, which made it much harder to program for, let alone optimise.

    SCE America, I think, was credited with the mid-cycle turnaround thanks to a lot of Western-developed exclusives (Naughty Dog were a real MVP), which is why the PlayStation identity seems to have largely shifted from Japanese to American from the launch of the PS4 onwards.

    I’m a bit of a tech hoarder, and still own my original PSP, PS1, PS2 and PS3s… so luckily my first-born is at no risk just yet. 😅