Broken 4.8VDC power adaptor: is it better to replace it with a 5V 1A or a 5V 2A?

I've got a 2nd generation Echo Dot that comes with a 1.8A power adapter and a USB cable. I've played around with different power adapters and it doesn't seem to be an issue using something else. I don't know if maybe it's possible to overload an underpowered adapter, but I haven't seen it happen.
They just drop in voltage and brown out.
 
Usually the charge circuits on these are reasonably complex when there's a Li-Po battery involved (which there appears to be), so the risk of overdriving the battery with a higher-current power source shouldn't exist. The charge system will only take the amount of current it's set up to use to charge the battery, so the extra headroom afforded by the higher amperage charger is a good thing. On the other hand, if it tries to draw more from the charger than the charger can give (the lower amperage charger) it may nuke the wall wart.
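
Just to illustrate that idea (a rough sketch only, not how any specific clipper's charge IC works, and the numbers are made up):

```python
# Rough sketch of the point above: the device (charge circuit) decides how much
# current to pull; the adapter rating only tells you how much it can supply.
# The 1.0A draw below is a made-up figure for illustration.

def adapter_headroom(device_draw_a, adapter_rating_a):
    """Return spare capacity in amps; negative means the adapter is overdrawn."""
    return adapter_rating_a - device_draw_a

charge_current = 1.0  # hypothetical draw of the clipper's charge circuit, in amps

for rating in (1.0, 1.5, 2.0):
    headroom = adapter_headroom(charge_current, rating)
    status = "OK" if headroom >= 0 else "overdrawn (may sag or overheat)"
    print(f"5V {rating}A adapter: {headroom:+.1f}A of headroom -> {status}")
```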

How do you know the battery is Li-Po? Your point about over-driving the wall wart is possible, though I don't think it would nuke it, just as you guys are probably right that half an amp more capacity won't over-drive anything, since the charge circuitry is in the trimmer. In this case, only a much larger supply VOLTAGE would potentially cause issues.

Going with a larger capacity (amperage) wart at the same voltage won't affect the performance of the device. I've used various 5VDC power supplies in the past to power my old 2nd gen Echo Dots when I lost the PS. IIRC, doesn't that wart also run at some weird voltage like 5.2VDC? The old FireTV sticks could be powered by my TV's 1A output; however, the latest 4K FireTVs require more juice and won't work with the 1A TV port. The latest Echo Dots come with hard-wired wall warts (3rd gen Dots are 15W, 12V/1.25A).

This is all sort of irrelevant to the subject, because a 5V/1A wart will either provide the 1A the OP's clippers ask for (straight battery charging), or the clippers will attempt to draw the original 1.5A from it.

One other variable: what does this thing draw when you try to trim while plugged in? My latest battery-powered trimmers don't have cords long enough for that, but they will turn on while plugged in. My older trimmers had long cords and, with aging batteries, you could tell a huge difference in the speed of the blades plugged in vs. just on battery, so they were definitely using more than just battery power while plugged in.
 
When the charger is plugged in, more juice is flowing ... and especially if the batteries are weak and not holding charge well, you can feel it.

In general you don't want your supply to be underrated. If your device was originally rated @ 1.5A and you now have a 1A and a 2A charger available, I would go with the 2A.
 
Yeah - that's basically Ohm's law. V = I x R. If the voltage is well regulated, the device's effective resistance/impedance will determine how much current it draws. I guess it would be like a hole in a tank. How fast it flows depends on how big the hole is and not how much water there is in the tank.
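
Quick back-of-the-envelope version of that (assuming an ideal, well-regulated 5V output and pretending the device is a plain resistor, which real electronics aren't, but it shows the point):

```python
# Ohm's law, I = V / R: with the voltage held at 5V, the device's effective
# resistance sets the current. The adapter's amp rating never enters into it,
# as long as the rating is at least as big as the draw.

voltage = 5.0                # regulated output, volts
effective_resistance = 5.0   # made-up effective load, ohms

current = voltage / effective_resistance
print(f"Device draws {current:.1f}A at {voltage}V")  # 1.0A from a 1A or a 2A wart alike
```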

Yeah the power adapter that comes with it says 5.2V, but that's not terribly important to proper functionality. The bigger Apple power adapters for iPads say 5.2V, but they're easily compatible with iPhones or any other device that comes with a 5.0V power adapter. I've powered that Echo Dot off of 5.0V/1A rated power adapter. Once I tried a USB power pack just to see if I could make it mobile, and it worked just fine.

In general I would think it's just a power supply. However, there have been these weird setups where anything other than the exact type of power adapter ran a serious risk of overheating, like the rash of vaping equipment fires. My guess is that they had very rudimentary circuitry inside the device, and the power adapter was needed to create a proper circuit.
 
We may be getting into "oil is oil" or "oil is not oil", I mean "charger is charger", territory ... as long as the adapter meets the spec, you will be fine! :alien: I don't think 0.5A at 5 volts (a 2.5W delta) is going to cause major issues.
 
My only worry would be that it's some sort of specialty circuit. For example my 12V 1A car charger that reads out at about 17V when open but goes way down when connected to a lead acid battery.
 
I can only speak from my experience with LDO and SMPS circuit tests / measurement in the past (which is very little).

In theory, if your power supply is rated for a certain current it is guaranteed to stay at the rated voltage for that amount of current draw or less. In other words, if you have a 4.8V 1.5A supply you can draw up to 1.5A and it will stay at 4.8V and not drop below, or if you have a 5.2V 1.7A supply it will sustain 5.2V while you draw up to 1.7A. Beyond that you will likely get a voltage drop, or the supply stops providing voltage completely.
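
A toy model of that guarantee (purely illustrative; real supplies sag or shut down in their own particular ways):

```python
# Toy model of the behaviour described above: the supply holds its rated
# voltage up to its rated current, and beyond that it either sags or cuts out.
# The droop formula below is invented for illustration only.

def supply_output(rated_v, rated_a, load_a):
    if load_a <= rated_a:
        return rated_v                       # within spec: voltage holds
    # beyond spec: assume it sags roughly in proportion to the overload
    return max(0.0, rated_v * rated_a / load_a)

print(supply_output(4.8, 1.5, 1.2))   # 4.8  -> within spec, holds its rated voltage
print(supply_output(5.0, 1.0, 1.5))   # ~3.3 -> overloaded 1A wart sags / browns out
print(supply_output(5.0, 2.0, 1.5))   # 5.0  -> the 2A wart never breaks a sweat
```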

Why would companies use these awkward voltages instead of standardizing on 5V? My guess is they are confident their manufacturing will get pretty close to the rated voltage (i.e. 4.8V held between 4.750V and 4.850V, 5.2V held between 5.150V and 5.250V), and they are still within the 5% tolerance most devices are designed for, so you won't nuke another device accidentally plugged into it. They bump the voltage up or down a little to work around some design constraint, like a charge-time limit or a current-draw limit. They know it won't fry anything, but they get what they want and this is the cheapest way to do it.
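
The tolerance arithmetic is easy to sanity-check, taking the common 5% figure at face value:

```python
# Quick check that 4.8V and 5.2V both sit inside a nominal 5V +/-5% window.
nominal, tolerance = 5.0, 0.05
low, high = nominal * (1 - tolerance), nominal * (1 + tolerance)

for v in (4.8, 5.0, 5.2):
    print(f"{v}V within {low:.2f}-{high:.2f}V: {low <= v <= high}")
```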
 
A 17V open-circuit reading like that could happen with an analog (unregulated) voltage source, but most modern switching supplies should not do this, at least not USB ones. That kind of sag would be a brownout and the circuit should reset itself.
 
Sure. However, I noted all those weird circuits that just reused common connectors even though they didn't operate properly on a regulated power supply. It wasn't just vaping devices but hoverboards.

I've seen some really weird charging circuits over the years. Once I bought my kid a racing set with battery-powered cars that went around a banked plastic track. You plugged the cars into a 4-AA battery pack for a few minutes to charge up their tiny internal batteries. I don't think it was anything more sophisticated than connecting the batteries together electrically, like trying to charge a dead car battery from a charged one. Some people have been able to start a car after leaving it like that long enough, even with the charged battery removed. There are some emergency backups that plug into the lighter adapter where you just wait long enough for the car battery to get enough charge from the device.
 
Anything designed for USB would probably have to deal with less than maximum input current. For example, my USB power packs can just charge slower if I plug them into a standard 0.5A port but of course charge faster if I use a 2.1A plug-in supply.
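
Rough charge-time math for that (the pack capacity below is invented, and real charging isn't this linear, but it shows why the 2.1A supply feels so much faster):

```python
# Back-of-the-envelope charge times for a hypothetical 10Ah (10,000mAh) power
# pack at 5V, ignoring conversion losses and end-of-charge tapering.
pack_capacity_ah = 10.0   # invented example capacity

for port_current in (0.5, 2.1):
    hours = pack_capacity_ah / port_current
    print(f"{port_current}A port: roughly {hours:.0f} hours")
```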

4.8V sounds really odd though.
Yes, but it depends on whether the product attempts to run from the PSU while simultaneously charging, or rather, if it can, whether the owner needs that feature.

The charging I was not concerned about. It was whether the motor was going to be overdriven at up to 5.6V, but of course if both PSUs regulate to 5.0V, one at 1A and the other at 2A, there would be no reason not to choose the 2A to replace a 1.5A original.
 
IMO, lower is better when it comes to battery charging. Charging his battery is not like running a CPU-powered component, and I'm assuming the OP won't be using this thing while it's charging, so why tax and possibly damage the charging circuit with too high a voltage/current?

The 1A charger either will or will not work.

@Dave9 No offense, but you’re making all of this way too complicated. If you couldn’t answer all of your questions based on the information in the OP (and asking the OP what kind of gadget he has - not sure why this hasn’t been asked yet) then what makes you think he can figure it out?

Put a diode in series to drop the voltage? For a trimmer? Are you serious?
Lower isn't always better. NiCd and NiMH, for example, require more than a minimal charge current (relative to their capacity) for a smart charge circuit to dependably sense the delta-V endpoint. In theory it is gentler to the cells to charge at a lower rate, but in practice that requires actively monitoring the voltage and manually terminating the charge, since a smart function can't operate that low, nor could a timer-based charger do anything more than run the full set time period regardless of how much charge was needed. This factor is one of the reasons that the *dumb* trickle chargers for NiCd and NiMH gave users bad experiences of shorter than expected cell life instead of approaching 1000 or more recharge cycles. Well, that and no low-voltage cutoff in the battery pack or equipment.
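
For anyone wondering what sensing delta-V actually means, here's a bare-bones sketch of the idea (not any particular charger's firmware, and the thresholds and readings are invented): the charger watches the cell voltage climb, then terminates once it dips a few millivolts below its peak, and that dip only shows up clearly at a reasonably brisk charge rate.

```python
# Bare-bones negative delta-V termination sketch for NiCd/NiMH charging.
# Real chargers also use temperature and timeout backups; numbers are invented.

def should_terminate(voltage_samples, delta_mv=5):
    """Stop charging once the latest reading is delta_mv below the peak seen so far."""
    if not voltage_samples:
        return False
    peak = max(voltage_samples)
    return (peak - voltage_samples[-1]) * 1000 >= delta_mv

# Simulated per-cell readings: rising, peaking, then the tell-tale dip.
readings = [1.35, 1.40, 1.44, 1.47, 1.48, 1.475, 1.472]
print(should_terminate(readings))  # True: about 8mV below the 1.48V peak
```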

This is more significant with the consumer AA and AAA cells, which can't stand much current, while the typical 18650 cell is now 2Ah or more and capable of 10A or more, so there's no real issue charging it at 1A or so. Plus, as Overkill already mentioned, unlike NiMH or NiCd, Li-Ion absolutely requires a charge termination circuit to not cause damage, or rather, damage would happen to NiCd or NiMH too, but the consequences are far less substantial, merely ruining the cell rather than risking a cascading fire.
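
The rate math behind that, using the 2000mAh figure from the OP's cell label:

```python
# C-rate = charge current / capacity. The OP's cell is labelled 2000mAh (2Ah).
capacity_ah = 2.0
for charge_current in (1.0, 1.5):
    print(f"{charge_current}A into a {capacity_ah}Ah 18650 = {charge_current / capacity_ah:.2f}C")
```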

Putting a diode in series... heh, you seem to think this is excessive while it is a trivial, quick, and cheap thing to do. More advanced would be to hack the PSU to drop its voltage, or to use a higher-voltage PSU and buck regulate down to 4.8V, or to reverse engineer the entire thing, because ultimately there are different ways to skin a cat, and different cats. Back to what I'd mentioned previously, it could matter whether the motor can run directly from the input voltage or only from the battery, fed by a 4.2V-max charge circuit.
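
For the record, the series-diode arithmetic (typical textbook forward drops, not measured on the OP's hardware):

```python
# A series diode simply subtracts its forward voltage drop from the supply.
supply_v = 5.0
for name, forward_drop in (("silicon diode (~0.7V)", 0.7), ("Schottky diode (~0.3V)", 0.3)):
    print(f"5V supply through a {name}: about {supply_v - forward_drop:.1f}V at the clipper")
```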

It is quite possible that any particular person might not be able to use all the info provided. Maybe some of it is useful, perhaps not, but that's the nature of things: people ask about what they don't know, and whether they understand the answer as well as someone who already knew it... chicken and egg scenario.

There is no short answer that is accurate given limited info. As products continue to integrate more specialized ICs to reduce component count, size, and cost, it becomes harder to generalize the behavior of one device from another, and yet I did generalize a bit, like assuming this clipper has a brushed rotary motor, which isn't the case for a significant % of mains-powered clippers.
 
How do you know the battery is Li-Po? Your point about over-driving the wall wart is possible, though I don't think it would nuke it, just as you guys are probably right that half an amp more capacity won't over-drive anything, since the charge circuitry is in the trimmer. In this case, only a much larger supply VOLTAGE would potentially cause issues.

This post from the OP:
OP said:
i opened it up and found a ICR 18650 2000 MAH 3.7V 18KPH20 battery inside. i assume this is a lithium ion battery
with my brief research on lithium ion batteries it seems to be that it needs 4.2v to charge however i am not sure why the power adaptor that came with it was a 4.8v 1.5a. it looks to be a regular power adaptor. or is it possible that the small circuit board inside the hair clipper deals with the charging and voltage?

it got me thinking, does that mean the original power adaptor is actually one thats made to charge lithium ion batteries? if so, cant i just find a old device that had a 3.7v lion battery and use that charger?

Suggests that he thinks it is Li-Po.

I've nuked a wall wart that was below spec for an app before, but typically, per @PandaBear's example you just get unusual performance. With a router or switch it might not boot, might boot but not pass traffic, boot loop....etc.

I wouldn't be wanting to under-feed a smart charge circuit for a lithium battery and get unpredictable performance from it.
 
How'd you manage to do that? A well designed wall wart will just stop at its limit and let the device deal with insufficient current. I've charged plenty of devices with a 1A (or even lower rated) power adapter and none of them died. Some of them were rather generic.
 
My guess is that it just wasn't well designed. The ASUS WL-500g was notorious for the OEM AC adapters failing. IIRC, it was a 12V 2A application. I had a 12V 1.5A adapter kicking around, not sure what from, and used it in place of the existing adapter that had died. It got quite hot and eventually failed. I managed to find another 12V 2A adapter later on, I think it was from a Linksys, and it worked fine.
 
It depends on who makes the device. It's not really all that difficult. Most would just buy an off-the-shelf power supply IC from ST or Texas Instruments and the required components. Even the big names in power supplies like Salcomp or Flextronics use someone else's ICs. I suppose cheaping out on the components might be risky.

Most of my 12V supplies are made by some generic company in China. I got a bunch of those years ago with external hard drives. I just stored the old hard drives and the power supplies are more or less mix and match. I had a dead 12V/1A power adapter for a Maha AA battery charger. It had a tiny barrel port, but I was able to find adapters on eBay for less than $2. Of course these 2A power supplies did just fine.
 
Yup, that's consistent with the majority of my experience. I believe I only ever nuked the one, but then most of the time I aim for the same or higher amperage.
 
I didn't Google 18KPH20. I just assumed it's generic; cheap 18650s are usually the cheaper li-ion.

I'll concede and defer to you smarter guys on over- or under-amping based on wall wart rating.
 
It's not that hard to employ best practices. It's not even that expensive. However, I could imagine in a cost-conscious marketplace there are some who will cut corners to save a few pennies here and there.

Best practices would be to not use a common connector for a non-standard setup, like those vaping devices I mentioned earlier. The idea is to have a complete charging circuit inside the device, and to use an ideal voltage source as a power supply. Obviously that doesn't always happen. I've seen some devices that I'm sure are using good practices, but they say to only use the power supply provided with the device. Makes it really tough when that power supply stops working.
 