Bigger Is Better, Or At Least It Used To Be
Owing partly to tradition, the shafts of electric motors are often larger than those of the equipment they drive. Engineers were very conservative a century ago, when electric motors first came into widespread industrial use, so they typically designed in a sizable safety margin. Today's engineers haven't changed much in this respect. For example, standard NEMA frame dimensions, which have been revised only once since 1950, still specify much larger shaft sizes than commonly accepted principles of mechanical engineering would require.
Shaft Design Basics
Shaft size is dictated by torque, not horsepower, but changes in horsepower and speed (RPM) affect torque, as the following equation shows:
Torque (lb-ft) = (HP x 5,252) / RPM
Accordingly, an increase in horsepower requires a proportional increase in torque, as does a decrease in speed (rpm). For example, a 100-hp motor designed for 900 rpm must deliver twice as much torque as a 100-hp motor designed for 1,800 rpm. Each shaft must be sized for the torsional load it is expected to carry.
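The doubling is easy to check numerically. A minimal sketch (the function name and variables are my own):

```python
def torque_lb_ft(hp: float, rpm: float) -> float:
    """Full-load torque in pound-feet: T = HP * 5252 / RPM."""
    return hp * 5252 / rpm

t_900 = torque_lb_ft(100, 900)    # ~583.6 lb-ft
t_1800 = torque_lb_ft(100, 1800)  # ~291.8 lb-ft
print(t_900 / t_1800)             # 2.0
```

Halving the speed at constant horsepower exactly doubles the required torque.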
Two basic, quite conservative approaches are used to determine the minimum required shaft size for motors. One method calls for making the shaft large enough (and therefore strong enough) to drive the specified load without breaking; mechanical engineers define this as the ability to transmit the required torque without exceeding the maximum allowable torsional shearing stress of the shaft material. In practice, this usually means the minimum shaft diameter can withstand at least twice the rated torque of the motor.
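For a solid circular shaft, the governing relation is shear stress = 16T / (pi * D^3); inverting it gives the minimum diameter for a given allowable stress. A minimal sketch, with units (pound-inches, psi) and names of my own choosing:

```python
import math

def min_diameter_in(torque_lb_in: float, allowable_psi: float) -> float:
    """Smallest solid-shaft diameter (in) keeping torsional shear
    stress 16T/(pi*D^3) at or below the allowable value."""
    return (16 * torque_lb_in / (math.pi * allowable_psi)) ** (1 / 3)

# Illustrative only: ~7,000 lb-in of torque at 4,000 psi allowable.
print(round(min_diameter_in(7000, 4000), 2))  # 2.07
```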
The other method is to calculate the minimum diameter needed to prevent excessive torsional deflection (twisting) in service. Here the allowable twisting moment, or torque, is a function of the allowable torsional shearing stress (in psi or kPa) and the polar section modulus (a property of the shaft's cross section).
Machinery's Handbook provides equations for determining minimum shaft sizes using both design approaches: resistance to torsional deflection and transmission of torque. Both sets of equations are based on standard values for steel, with allowable stresses of 4,000 psi (2.81 kg/mm²) for power-transmitting shafts and 6,000 psi (4.22 kg/mm²) for line shafts with sheaves (the proper name for what most of us incorrectly call pulleys). Some of the equations are also specific to keyed or non-keyed shafts, which is handy for pump users who need to calculate the size of each kind.
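The commonly quoted handbook forms for these two stresses follow from the shear-stress relation with T = 63,025 x P / N pound-inches. A sketch (treat the coefficients as my derivation at the stated stresses, not a quotation of the handbook):

```python
def d_power_transmitting(hp: float, rpm: float) -> float:
    """Minimum diameter (in) at 4,000 psi allowable shear:
    D = (80 * P / N) ** (1/3)."""
    return (80 * hp / rpm) ** (1 / 3)

def d_line_shaft(hp: float, rpm: float) -> float:
    """Minimum diameter (in) for a line shaft carrying sheaves at
    6,000 psi allowable shear: D = (53.5 * P / N) ** (1/3)."""
    return (53.5 * hp / rpm) ** (1 / 3)

# 200 hp at 1,800 rpm, for illustration:
print(round(d_power_transmitting(200, 1800), 2))  # 2.07
print(round(d_line_shaft(200, 1800), 2))          # 1.81
```

Note how the higher allowable stress for line shafts yields a smaller minimum diameter for the same load.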
The Transmission Of Torque Approach
Most motor shafts are keyed, which increases the shear stress exerted on the shaft. To account for this, motor shaft designs typically use no more than 75 percent of the maximum recommended stress for a non-keyed shaft, which is also why electric motor shafts are often larger than the pump shafts they drive.
Consider a 200-hp (150-kW), 1,800-rpm motor. For a direct-coupled application, the standard frame size is 445TS, with a (keyed) shaft diameter of 2.375 in (60 mm). Using Equation , the minimum shaft size would be:
Since the shaft is sized to withstand twice the rated torque of the 200-hp motor, the calculated diameter of 2.371 in is the absolute minimum for a 400-hp rating.
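The article's equation is not reproduced above, so the following is only one combination of the stated factors (rated torque doubled, and the 75 percent keyed-shaft figure folded into the 4,000-psi handbook coefficient) that reproduces the printed 2.371-in result; whether it matches the author's omitted equation is an assumption.

```python
def keyed_min_diameter_in(hp: float, rpm: float,
                          handbook_coeff: float = 80.0,
                          keyed_factor: float = 0.75,
                          torque_multiple: float = 2.0) -> float:
    """Hypothetical reading of the article's sizing equation:
    D = (torque_multiple * keyed_factor * coeff * P / N) ** (1/3)."""
    return (torque_multiple * keyed_factor * handbook_coeff
            * hp / rpm) ** (1 / 3)

print(round(keyed_min_diameter_in(200, 1800), 3))  # 2.371
```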
Resistance To Twisting Method
The other way to calculate minimum shaft size for a motor is to set a limit on the amount of torsional deflection (twisting) that may occur in service. Resistance to twisting rises steeply with shaft size, because the shaft's polar moment of inertia grows with the fourth power of its diameter: the larger the diameter, the greater the resistance to twisting.
A rule of thumb with this method is that the shaft must be large enough that it will not deflect more than 1 degree in a length of 20 times its diameter. To calculate the minimum shaft size to meet this specification, use the following equation:
For the 200-hp (150-kW), 1,800-rpm motor from Example 1, the minimum shaft size to limit torsional deflection would be:
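Assuming the commonly cited Machinery's Handbook rule for steel shafts under the 1-degree-in-20-diameters limit, D = 0.1 x T^(1/3) with T in pound-inches, the calculation can be sketched as follows (the rule's applicability to this example is my assumption):

```python
def deflection_min_diameter_in(hp: float, rpm: float) -> float:
    """Minimum steel-shaft diameter (in) limiting twist to 1 degree
    over a length of 20 diameters: D = 0.1 * T**(1/3), T in lb-in."""
    torque_lb_in = hp * 5252 / rpm * 12
    return 0.1 * torque_lb_in ** (1 / 3)

print(round(deflection_min_diameter_in(200, 1800), 2))  # 1.91
```

Under this criterion the minimum comes to roughly 1.91 in, noticeably less than the 2.375-in standard 445TS shaft, consistent with the article's point that NEMA shaft sizes carry a generous margin.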