IMO, lower is better when it comes to battery charging. Charging his battery is not like running a CPU-powered component, and I’m assuming the OP won’t be using this thing while it’s charging, so why tax and possibly damage the charging circuit with too high a voltage/current?
The 1A charger either will or will not work.
@Dave9 No offense, but you’re making all of this way too complicated. If you couldn’t answer all of your questions based on the information in the OP (and asking the OP what kind of gadget he has - not sure why this hasn’t been asked yet) then what makes you think he can figure it out?
Put a diode in series to drop the voltage? For a trimmer? Are you serious?
Lower isn't always better. NiCd and NiMH, for example, require more than a minimum charge current (relative to their capacity) for a smart charge circuit to dependably sense −ΔV. In theory it is gentler to the cells to charge at a lower rate, but in practice that requires actively monitoring the voltage and manually terminating the charge, since a smart termination function can't operate that low, nor could a timer-based charger do anything more than run the full set time period regardless of how much charge was needed. This factor is one of the reasons that the *dumb* trickle chargers for NiCd and NiMH gave users bad experiences of shorter than expected cell life instead of approaching 1000 or more recharge cycles. Well, that and no low voltage cutoff in the battery pack or equipment.
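To make the −ΔV point concrete, here's a minimal sketch of what that termination logic amounts to: track the peak cell voltage and stop when it falls back a few millivolts. The voltage trace is hypothetical, and a real charger IC also watches temperature rise and has timeout backups; at a low charge current the dip never gets big enough to detect, which is exactly the failure mode described above.

```python
# Minimal -dV (negative delta-V) termination sketch for NiCd/NiMH.
# Sample values are hypothetical, in millivolts.

def minus_delta_v_terminate(samples_mv, drop_mv=5):
    """Return the sample index at which charging would terminate,
    or None if the required voltage dip was never seen."""
    peak = float("-inf")
    for i, v in enumerate(samples_mv):
        peak = max(peak, v)
        if peak - v >= drop_mv:  # voltage fell back from its peak
            return i
    return None

# Cell voltage rises, peaks near full charge, then dips slightly:
trace = [1380, 1420, 1450, 1465, 1470, 1468, 1464]
print(minus_delta_v_terminate(trace))  # terminates at index 6
```

At a very low charge rate the dip can be only a millivolt or two, lost in noise, so the function above would return None and the charger would never stop on its own.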
This is more significant with the consumer AA and AAA cells which can't stand much current, while the typical 18650 cell is now 2Ah or more and capable of 10A or more, so no real issue charging it at 1A or so. Plus, as Overkill already mentioned, unlike NiMH or NiCd, Li-Ion absolutely requires a charge termination circuit to not cause damage. Or rather, damage would happen to NiCd or NiMH too, but the consequences are far less substantial, merely ruining the cell rather than risking a cascading fire.
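The "no real issue at 1A" claim is just a C-rate comparison (charge current divided by cell capacity). A quick sketch with assumed capacities (2Ah for the 18650, ~0.8Ah for a typical AAA NiMH):

```python
# Rough C-rate check: charge current relative to cell capacity.
# Capacities below are typical assumed values, not from the thread.

def c_rate(charge_current_a, capacity_ah):
    return charge_current_a / capacity_ah

# A 2 Ah 18650 charged at 1 A is only 0.5C -- easy on that cell:
print(c_rate(1.0, 2.0))  # 0.5
# The same 1 A into a 0.8 Ah AAA NiMH is 1.25C -- much harsher:
print(c_rate(1.0, 0.8))  # 1.25
```

Same 1A charger, very different stress on the cell, which is why the cell chemistry and size matter more than the charger's current rating alone.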
Putting a diode in series... heh, you seem to think this is excessive while it is a trivial, quick, and cheap thing to do. More advanced would be hacking the PSU to drop its voltage, or using a higher voltage PSU and buck regulating down to 4.8V, or reverse engineering the entire thing, because ultimately there are different ways to skin a cat, and different cats. Back to what I'd mentioned previously, it could matter whether the motor can run direct from the input voltage or only isolated to a battery-only supply, fed by a 4.2V max charge circuit.
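The series-diode trick is just back-of-envelope arithmetic: each silicon diode drops roughly 0.6-0.7V (it varies with current and the diode used, so treat the number as an assumption). A sketch, using a hypothetical 6V supply and the 4.8V target mentioned above:

```python
# Back-of-envelope: series silicon diodes to knock down a PSU's output.
# 0.7 V per diode is a typical forward drop; the real figure depends
# on the diode and the load current.

SI_DIODE_DROP = 0.7  # volts, approximate

def output_after_diodes(psu_volts, n_diodes, drop=SI_DIODE_DROP):
    return psu_volts - n_diodes * drop

# Hypothetical 6 V supply aiming for roughly 4.8 V:
print(output_after_diodes(6.0, 1))  # ~5.3 V, one diode
print(output_after_diodes(6.0, 2))  # ~4.6 V, two diodes
```

One or two diodes bracket the target here, which is why it's a trivially cheap fix compared to hacking the PSU or adding a buck regulator.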
It is quite possible that any particular person might not be able to use all the info provided. Maybe some of it is useful, or perhaps not, but that's the nature of things - people ask about what they don't know, and whether they understand the answer as well as someone who already knew... it's a chicken and egg scenario.
There is no short answer that is accurate given limited info. As products continue to integrate more specialized ICs to reduce component count, size, and cost, it becomes harder to generalize the behavior of one device based on some other. And yet I did generalize a bit, like assuming this clipper has a brushed rotary motor, which isn't the case for a significant % of mains-powered clippers.