03.31.2008, 01:01 PM
This is a good question.
With no load, a motor draws its "Io" (no-load current) rating at the rated voltage, which is usually under 5 A for all but extremely low-turn motors. So with no load, the power dissipated is the voltage times this Io rating. For example: a motor with an Io rating of 2 A @ 14 V dissipates 28 W with no load.
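Just to make the arithmetic explicit (same numbers as the example above, nothing new assumed):

```python
# No-load input power = rated voltage x no-load current.
V = 14.0   # rated voltage, volts
Io = 2.0   # no-load current "Io" rating, amps

p_no_load = V * Io
print(p_no_load)  # -> 28.0 watts, all of it ending up as heat
```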
When we put a load on the motor, it draws more current, so it stands to reason that more current means more power dissipated.
I've done a little reading on this and there are a couple of explanations, neither of which sounds quite right.
1) An unloaded motor doesn't direct any of the input power to "work," so all of it is wasted as heat. Load the motor and more of the power goes into making it move. Kinda makes sense at first glance, but something's off - this theory seems too simplistic.
2) A conflicting answer says this isn't true: an unloaded motor heats up because the vehicle generally isn't moving (or not as fast), so there's less airflow to cool the motor. This makes more sense to me.
I've run an underloaded motor and it does get hotter than a "properly" loaded one. Is that because the vehicle isn't moving as fast, so there's less airflow over the motor? I'd think the back-EMF factor has something to do with it as well...
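To put some numbers on the back-EMF idea, here's a minimal sketch of the standard brushed-DC motor model, V = I*R + Ke*rpm. The winding resistance R and back-EMF constant Ke below are made-up illustration values, not from any real motor; only V and Io come from the example above.

```python
# Assumed example values (not from any specific motor):
R = 0.05    # winding resistance, ohms
Ke = 0.002  # back-EMF constant, volts per RPM
V = 14.0    # supply voltage, from the example above
Io = 2.0    # no-load current, from the example above

def current(rpm):
    """Current drawn at a given speed: (supply voltage - back-EMF) / R."""
    return (V - Ke * rpm) / R

def copper_loss(rpm):
    """Heat dissipated in the windings: I^2 * R."""
    i = current(rpm)
    return i * i * R

# Unloaded: the motor settles at the speed where current equals Io.
no_load_rpm = (V - Io * R) / Ke
print(current(no_load_rpm))      # -> about 2 A (by construction)
print(copper_loss(no_load_rpm))  # -> about 0.2 W in the copper

# Loaded: speed drops, back-EMF drops, current and I^2*R climb fast.
loaded_rpm = no_load_rpm / 2
print(current(loaded_rpm))       # -> about 141 A with these numbers
print(copper_loss(loaded_rpm))   # -> roughly a kilowatt of copper heat
```

So copper heating grows with the square of current as load drags the speed down and the back-EMF shrinks. Note that at no load the copper loss is tiny: most of the 28 W input is friction and iron losses, which still end up as heat, and with the vehicle sitting still there's no airflow to carry it away.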