Regularly someone suggests that it's better to wire a 120/240 volt table saw motor for 240v. The responses generally fall into two groups: the first points out that a 120/240 volt motor performs exactly the same at whichever voltage it's configured for (this is correct, and tends to be the group I fall into); the opposing group is people who've swapped a saw from 120v to 240v and then observed a noticeable improvement in performance.
My theory:
That when operated at design voltage, there is no advantage to operating a 120/240 motor at 240v.
That any difference can be traced directly to voltage drop.
That, for motors of 1.5hp and above, it can be very difficult to reduce the voltage drop to an insignificant percentage when run at the lower voltage.
That existing 120v circuits are much more likely to have voltage reducing factors than 240v circuits.
That in actual use saw motors will tend to perform better when wired to 240v than when wired to 120v.
So, now let's test out some of these theories, starting with "there is no advantage to operating a 120/240 motor at 240v".
We will use ordinary A lamps to illustrate what is going on in the motor windings. We take two 120v, 43 watt light bulbs, each producing 680 lumens of visible light along with its 43 watts of heat. They are connected in the conventional way, each lamp directly across 120v.
The result isn't a surprise:
[url=https://flic.kr/p/L13xrn]photo[/url]
Note the amps: .69 amps, producing a total of 86 watts of heat (and some light).
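If you want to check the math yourself, here's a quick sketch of the arithmetic in Python (the 43 watt and 120v figures are the bulb ratings from above; a real filament isn't a perfect resistor, which is part of why the meter reads .69 instead of the calculated .72):

[code]
# Two 120v, 43 watt bulbs wired the conventional way: in parallel,
# each lamp directly across 120v.
VOLTS = 120.0
WATTS_PER_BULB = 43.0

amps_per_bulb = WATTS_PER_BULB / VOLTS      # I = P / V
total_amps = 2 * amps_per_bulb              # both branch currents share the feeder
total_watts = 2 * WATTS_PER_BULB

print(f"each bulb:  {amps_per_bulb:.2f} A")   # ~0.36 A
print(f"feeder:     {total_amps:.2f} A")      # ~0.72 A (the meter read .69)
print(f"total:      {total_watts:.0f} W")     # 86 W of heat (and some light)
[/code]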
Now, let's plug the two lamps into the 240v circuit. To keep from burning up both lamps, we rearrange the wires so the two lamps are in series. Just like the saw: if we don't connect the wires differently, we will burn up the lamps (or the motor) very quickly.
Here we are:
[url=https://flic.kr/p/Kbnah4]photo[/url]
Now the light is the same, the watts (heat) are the same, and the amps in that wire have dropped to .34. The other wire carries that same .34 amps, because one current flows through both lamps in turn: half the amps at twice the volts, which works out to the same 86 watts as the 120v circuit (once you allow for meter rounding). So the light, the heat, and the watts, all the same.
A motor will act exactly the same as these two lamps; it does not care what voltage circuit you run into its splice box. Each lamp (winding) gets 120v either way.
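Here's the same sketch for the series hookup. Treating each bulb as a fixed resistance is a simplification (filament resistance changes with temperature), but it shows why each lamp, and each motor winding, still sees 120v:

[code]
# The same two bulbs, now wired in series across 240v -- the same
# rearrangement the 240v connection makes to the motor windings.
VOLTS_SUPPLY = 240.0
VOLTS_RATED = 120.0
WATTS_PER_BULB = 43.0

r_bulb = VOLTS_RATED**2 / WATTS_PER_BULB      # R = V^2 / P, about 335 ohms
loop_amps = VOLTS_SUPPLY / (2 * r_bulb)       # one current flows through both bulbs
volts_per_bulb = loop_amps * r_bulb           # each bulb's share of the 240v
total_watts = VOLTS_SUPPLY * loop_amps

print(f"loop current:   {loop_amps:.2f} A")       # ~0.36 A calculated (meter read .34)
print(f"volts per bulb: {volts_per_bulb:.0f} V")  # 120 V, same as before
print(f"total power:    {total_watts:.0f} W")     # 86 W, same as before
[/code]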
What's the difference to a woodworker, and why the endless debate? The voltage drop.
Let's compare a pretty typical situation. You move into your new house, set up a shop in the garage, and plug your 120v saw into a convenient 120v outlet. Are there other loads on that circuit? Maybe. Does that circuit run through other outlets, perhaps push-wire outlets that don't pass the current through as well as they should, adding to voltage drop? Probably. Aluminum wire? #14 wire somewhere in the circuit? Not usually, but sometimes.
Let's ignore those factors for now and focus on distance. Assuming 75 feet from the saw to the breaker, a 1.5hp saw pulling 20 amps (that's the number the NEC says to use unless you have the motor in front of you and can read its nameplate), #12 copper wire, and perfect connections, the 120v at the panel is reduced 4% to 115 volts at full rated load. Longer distances, less-than-perfect connections, any load sharing, and it just gets worse.
Next, having decided to take that excellent advice you got from the forum at woodworkingtalk.com, you run a 240v circuit to your saw. Are there other loads on that circuit? No. Push-wire outlets? No. Perfect connections? Probably.
Again we ignore those factors. The big difference is that where before we had 20 amps traveling 75 feet in #12 wire, now we have 10 amps. Half the current means half the volts lost in the wire, and those lost volts are a smaller slice of 240 than of 120, so what started out as 240v arrives only 1% lower, at 238 volts. Each saw winding gets 119 volts.
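And here's the voltage drop arithmetic for both hookups side by side. I'm assuming roughly 1.6 ohms per 1000 feet for #12 copper (close to the standard wire-table figure) and the 20 amp / 10 amp full-load numbers from above:

[code]
# Voltage drop over 75 feet of #12 copper: the same 1.5hp saw wired
# for 120v (20 A) versus 240v (10 A).
OHMS_PER_FOOT = 1.6 / 1000.0    # ~#12 AWG copper, about 1.6 ohms per 1000 ft
ONE_WAY_FEET = 75.0

def drop(volts, amps):
    """Volts lost in the wire, percent drop, and volts left at the saw."""
    r_wire = OHMS_PER_FOOT * ONE_WAY_FEET * 2   # the current goes out and back
    lost = amps * r_wire                        # V = I * R
    return lost, 100 * lost / volts, volts - lost

for volts, amps in [(120, 20), (240, 10)]:
    lost, pct, at_saw = drop(volts, amps)
    print(f"{volts}v circuit: {lost:.1f} V lost ({pct:.0f}%), {at_saw:.0f} V at the saw")

# 120v circuit: 4.8 V lost (4%), 115 V at the saw
# 240v circuit: 2.4 V lost (1%), 238 V at the saw (119 V per winding)
[/code]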
This is best-case, and I would expect to see larger, sometimes much larger, differences in actual installations.
Opinion:
In theory, wiring a saw motor to 240v provides no performance advantage. In the real world, though, the same motor wired to 120v is much more likely to encounter factors that reduce the voltage, and therefore the performance, so the 240v hookup will tend to run better.
I welcome any thoughts, especially from those who disagree. And, thanks for taking the time to read this.