• zero_iq@lemm.ee

    In this thread: people who don’t understand what power is.

    Power isn’t something that is “pushed” into a device by a charger. Power is the rate at which a device uses energy. Power is “consumed” by the device, and the wattage rating on the charger is simply the maximum it can supply, which is determined by how much current it can handle at its output voltage. A device only draws the power it needs to operate, and this may go up or down depending on what it’s doing, e.g. whether your screen is on or off.
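
    To make that concrete, here’s a minimal sketch of the arithmetic (P = V × I), with made-up numbers:

        # Hypothetical numbers, just to illustrate P = V * I.
        charger_voltage = 5.0       # volts, fixed by the charger
        charger_max_power = 25.0    # watts, the rating printed on the label

        device_power_demand = 7.5   # watts the phone needs right now (screen on)

        # The device determines the draw, not the charger:
        current_drawn = device_power_demand / charger_voltage      # 1.5 A
        current_available = charger_max_power / charger_voltage    # 5.0 A
        print(f"Device draws {current_drawn} A of {current_available} A available")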

    As long as the voltage is correct, you could hook your phone up to a 1000W power supply and it would be absolutely fine. This is why everything’s OK when you plug devices into a gaming PC that has a 1000W power supply, or why you can swap out a power-hungry video card for a low-power one and the power supply won’t fry your PC. All that extra power capability simply goes unused if it isn’t called for.
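
    A toy illustration of that headroom, with invented wattages:

        # A 1000 W supply powering different loads; the rating is a
        # ceiling, not a push. All the numbers here are hypothetical.
        supply_capacity_w = 1000.0

        for load_name, load_w in [("power-hungry GPU", 300.0),
                                  ("low-power GPU", 75.0),
                                  ("phone", 10.0)]:
            headroom = supply_capacity_w - load_w
            print(f"{load_name}: draws {load_w} W; {headroom} W of capacity goes unused")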

    The “pushing force” that is scaled up or down is voltage. USB chargers advertise their capabilities, or a power delivery protocol (such as USB PD) is used to negotiate voltages, so the device can choose to draw more current, and thus more power, from the charger as it sees fit. (If the device tries to draw too much, a poorly-designed charger may fail, and in turn this could pass inappropriate voltages and currents on to the device, damaging both. Well-designed chargers have protections to prevent this, even in the event of failure. Cheap crappy chargers often don’t.)
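
    If it helps, here’s the spirit of that negotiation as a toy model. This is not the real USB PD protocol, just the idea; the profiles and numbers are invented:

        # Charger advertises (voltage, max current) profiles; the device
        # picks one and then draws only the current it actually needs.
        advertised_profiles = [(5.0, 3.0), (9.0, 3.0), (20.0, 5.0)]  # (V, max A)

        def pick_profile(device_needs_w):
            # Choose the lowest voltage whose V * I_max covers the demand.
            for volts, max_amps in advertised_profiles:
                if volts * max_amps >= device_needs_w:
                    return volts, max_amps
            return advertised_profiles[-1]    # otherwise take the biggest on offer

        volts, max_amps = pick_profile(27.0)  # e.g. a laptop wanting ~27 W
        print(f"Negotiated {volts} V; device may draw up to {max_amps} A")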