  (#17)
BrianG
RC-Monster Admin
 
Posts: 14,609
Join Date: Nov 2005
Location: Des Moines, IA
11.19.2006, 12:15 AM

This thread seems a bit confusing. I agree with GD for the most part. Assuming the motor impedance is constant and the load is constant, an increase in voltage will increase current.

A motor is not a "constant power" device, so you can't say a motor is going to put out, say, 1800 watts no matter what voltage is applied. This is painfully obvious from the runtimes we actually get. If a motor always pulled 1800 watts, batteries would last a VERY short time. For a given voltage, it draws the current it needs to do the job, and power is the product of that voltage and current (P = V x I). Finding amperage by dividing a constant wattage by the applied voltage is not correct, sorry.
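To see why that math falls apart, here's a quick Python sketch (my own illustration; the 1800 W is the figure above, and the voltages are just assumed example pack voltages):

[CODE]
# Sketch of the fallacy above: treating a motor as constant-power and
# dividing a fixed wattage by the applied voltage. The voltages are
# assumed values for illustration; the 1800 W figure is from the post.
P_assumed = 1800.0                 # "the motor always puts out 1800 W"

for volts in (7.2, 14.4, 28.8):
    amps = P_assumed / volts       # this is the incorrect calculation
    print(f"{volts:5.1f} V -> {amps:6.1f} A (constant-power fallacy)")

# A real motor's current is set by the load and the back EMF, not by a
# fixed wattage, so these numbers don't match reality.
[/CODE]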

However, motor RPM does play a part in this. Higher rpms increase the back EMF, which opposes the applied voltage, acts much like added resistance, and decreases current. So the effective "resistance" is not constant the way it is in a pure resistor. An increase in voltage will increase rpms, which increases the back EMF, which raises the effective impedance somewhat and holds the current down, but not nearly as much as the conversation here seems to imply.
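To put that chain in equation form, here's a minimal steady-state sketch (my own numbers; the Ke and R values are assumed, not measured from any particular motor):

[CODE]
# Minimal DC motor steady-state model: current is driven by whatever
# voltage is left over after back EMF.  I = (V - Ke*rpm) / R
R  = 0.5        # winding resistance in ohms (assumed)
Ke = 0.0005     # back-EMF constant in volts per rpm (assumed)

def current(volts, rpm):
    """Armature current once the motor has settled at the given rpm."""
    return (volts - Ke * rpm) / R

# Doubling the voltage roughly doubles the rpm, so back EMF rises too,
# and the current climbs less than a pure resistor would predict:
print(current(10.0, 10000))   # 10.0 A
print(current(20.0, 21000))   # 19.0 A, not the 20 A a 1-ohm resistor gives
[/CODE]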

Example: at 10 V, a motor draws 10 A. That's 100 W, which equates to a 1 ohm effective "resistance."
However, at 20 V, the same motor with the same load may draw "only" 18 A. That's 360 W, and the effective impedance increased to about 1.11 ohms. A pure resistor would have stayed at 1 ohm and drawn 20 A (400 W). The higher rpms, and the increased back EMF they created, made the effective resistance go up a little. But it certainly did not go up enough to hold the power at the same 100 W; that would take 4 ohms and a draw of only 5 A.
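Running the numbers from that example (just a sketch; it applies P = V x I and R = V / I to the figures above):

[CODE]
# Back-solving the example above with plain Ohm's-law arithmetic.
V1, I1 = 10.0, 10.0
V2, I2 = 20.0, 18.0

print(V1 * I1, V1 / I1)   # 100.0 W at an effective 1.0 ohm
print(V2 * I2, V2 / I2)   # 360.0 W at an effective ~1.11 ohms

# For the motor to hold a constant 100 W at 20 V it would have to drop
# to 5 A, a 4-ohm effective resistance, far beyond what the extra
# back EMF provides here.
print(100.0 / V2, V2 / (100.0 / V2))   # 5.0 A, 4.0 ohms
[/CODE]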

Last edited by BrianG; 11.19.2006 at 12:17 AM.