Quote:
|
Originally Posted by zeropointbug
Finnster, are you sure you have your understanding of efficiency down?
You said: "But at 6400W @ 64V/100A, the motor could even be same or less eff and put out less heat. Thus you could run a poorer quality motor, but still have a cool system."
Heat output does not depend on current directly; it's how much power you are putting in, how well the motor transforms that energy, and how much mechanical power comes out at the shaft. It doesn't matter if a motor is 94% eff. at 100V/5A, or 94% at 50V/10A... you will still have the same heat output for a given power input.
The correct assumption is that a given motor will run more efficiently the higher the voltage. And as you up the voltage, the smart thing to do is usually to run a higher-turn motor (that's with the same gearing). The amount of heat generated from current will go down proportionally as you increase voltage and increase turns. This is NOT the total heat output, however; there are other losses that contribute too.
I think with these larger RC cars, such as truggies, we need to increase the voltage to, say, 36V, but that's just IMO. That would mean using something like a 16-18 turn XL-size motor. Or better yet a Neu or LMT. Efficiency would go up several % points, and that makes a HUGE difference as far as heat output is concerned.
Zero
|
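Zero's point about the current-driven portion of the heat can be sketched with a quick copper-loss (I²R) calculation. The winding resistance here is a made-up illustrative number, not a real motor spec:

```python
# Copper (I^2 * R) loss sketch: at a fixed input power, doubling the voltage
# halves the current, so resistive heating in the windings drops by 4x for
# the same winding resistance. This is only the current-driven loss, not the
# motor's total heat output.
def copper_loss_w(power_w: float, volts: float, winding_ohms: float) -> float:
    amps = power_w / volts
    return amps ** 2 * winding_ohms

r = 0.01  # assumed 10 milliohm winding resistance, purely illustrative
print(copper_loss_w(6400, 64, r))   # 100A -> ~100W in the windings
print(copper_loss_w(6400, 128, r))  # 50A  -> ~25W, a quarter of the above
```

Same 6400W in, but the I²R share of the heat is four times smaller at double the voltage, which is the "goes down proportionally as you increase voltage" effect.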
You know, I was thinking about this on the way home (forgive me, I've had a long, crappy day and I'm feeling a bit off). I made that statement, but either way that leaves ~400W of wasted power. "Where would that go?" Then I realized it has nowhere to go other than as heat. I think I have been too stuck on the resistance/amperage relationship, treating this as a simple circuit and its effects on the rest of the system, but it's more involved than that.
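That "nowhere to go but heat" realization is easy to sanity-check: at a given input power, the waste heat depends only on efficiency, not on how the volts and amps are split. The 94% figure below is just an illustrative number that lands near the ~400W above:

```python
# Sanity check: waste heat = input power * (1 - efficiency), regardless of
# whether that power arrives as 64V/100A or 128V/50A.
def waste_heat_w(input_power_w: float, efficiency: float) -> float:
    """Watts that leave the motor as heat rather than shaft power."""
    return input_power_w * (1.0 - efficiency)

# 6400W in at an assumed ~94% efficiency:
print(f"{waste_heat_w(6400, 0.94):.0f}W of heat")  # -> 384W of heat, i.e. ~400W
```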
Yes, I do see where 1 or 2% would make a big difference when talking about the large amounts of power we are trying to generate. 2% of 1500W is 30W, which is a significant amount of heat to try to dissipate. The efficiency differences seem more dramatic at "non-ideal" loads, where the difference can be several %, so the effect would be more dramatic in totality.
My thinking was along the lines of: would I prefer a 1500W system on 4S @ 92% eff, or a 1500W system on 6S @ 90% eff? (Just a thought experiment, say comparing a 4S LMT vs a 6S XL.) I was thinking the difference in efficiency would be more than made up for by the lower current demands. It's now apparent the choice isn't so obvious....
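Running that thought experiment with numbers makes the trade-off concrete. Pack voltages are nominal LiPo (3.7V per cell) and the efficiency figures are the hypothetical ones from the comparison, not measured specs:

```python
# Thought experiment: 1500W on 4S @ 92% eff vs 6S @ 90% eff.
# Nominal LiPo cell voltage; all numbers illustrative.
CELL_V = 3.7

def system(power_w: float, cells: int, efficiency: float):
    """Return (pack volts, current draw, waste heat) for a given setup."""
    volts = cells * CELL_V
    amps = power_w / volts
    heat_w = power_w * (1.0 - efficiency)
    return volts, amps, heat_w

for cells, eff in [(4, 0.92), (6, 0.90)]:
    v, a, h = system(1500, cells, eff)
    print(f"{cells}S: {v:.1f}V pack, {a:.1f}A draw, {h:.0f}W of heat")
# -> 4S: 14.8V pack, 101.4A draw, 120W of heat
# -> 6S: 22.2V pack, 67.6A draw, 150W of heat
```

So the 6S setup cuts the current draw by about a third, but at these efficiency figures it also makes 30W more motor heat, which is exactly why the choice isn't obvious.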