Why Compression Ratio Matters


Written by:
Robert X. Perez

One of the critical parameters in compressor design and selection is the compression ratio, often denoted as r. The compression ratio is simply the ratio of the absolute stage discharge pressure to the absolute stage suction pressure.

 

Because most gases increase in temperature when they are compressed, the final compressor outlet temperature is always a concern. A high discharge temperature can lead to the failure of internal components due to material degradation or excessive thermal expansion. Compression ratio is also important in determining required horsepower; the higher the ratio, the greater the required horsepower for that stage.

 

Compression Ratio versus Discharge Temperature

 

Here is a simple example of how to calculate the compression ratio. Suppose we compress a gas with a ratio of specific heats of 1.3 (see the ratio of specific heats box) from a suction pressure of -0.5 psig to a discharge pressure of 35 psig. To calculate the compression ratio, first convert both pressures to absolute pressure by adding 14.7 psi to each term, then divide the absolute discharge pressure by the absolute suction pressure:



r = \frac{P_d + 14.7}{P_s + 14.7} = \frac{35 + 14.7}{-0.5 + 14.7} = \frac{49.7}{14.2} \approx 3.5

Equation 1
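For readers who prefer to script the calculation, here is a minimal Python sketch of Equation 1. The function name compression_ratio and its arguments are ours, not from the article:

```python
def compression_ratio(p_suction_psig, p_discharge_psig, p_atm=14.7):
    """Equation 1: compression ratio r from gauge pressures (psig)."""
    return (p_discharge_psig + p_atm) / (p_suction_psig + p_atm)

print(round(compression_ratio(-0.5, 35.0), 2))  # 3.5
```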

 

Once we know the compression ratio (and assuming there are no internal losses), we can determine the theoretical discharge temperature using Equation 2, which is based on adiabatic compression.







T_d = T_s \, r^{\frac{k-1}{k}}

Equation 2

 

Where:

 

T_d, T_s = Absolute discharge and suction temperatures (deg R)

 

k = Ratio of specific heats (Cp/Cv)

 

r = Compression ratio calculated by Equation 1.

 

Assuming a suction temperature of 60 deg F (520 deg R), Equation 2 gives T_d = 520 \times 3.5^{0.3/1.3} \approx 694 deg R, or about 234 deg F.
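The same arithmetic in Python, using the common 460-degree conversion between deg F and deg R (discharge_temp_f is a hypothetical helper name):

```python
def discharge_temp_f(t_suction_f, r, k=1.3):
    """Equation 2: theoretical adiabatic discharge temperature (deg F)."""
    t_suction_rankine = t_suction_f + 460.0  # deg F -> deg R
    return t_suction_rankine * r ** ((k - 1.0) / k) - 460.0

print(round(discharge_temp_f(60.0, 3.5), 1))  # 234.3
```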

 


We will take this exercise a step further by increasing the compressor discharge pressure in 5 psi increments to see what happens to the discharge temperature. Table 1 summarizes the results. As the discharge pressure increases, the compression ratio rises and the discharge temperature (Td) correspondingly increases. In this example, Td increases from 234.3 deg F for a compression ratio of 3.5 to 335.7 deg F for a compression ratio of 6.32.

 

Discharge Pressure (psig)    Compression Ratio (r)    Theoretical Td (deg F)
35                           3.50                     234.3
40                           3.85                     249.8
45                           4.20                     264.3
50                           4.56                     277.9
55                           4.91                     290.7
60                           5.26                     302.8
65                           5.61                     314.3
70                           5.96                     325.2
75                           6.32                     335.7
Table 1. The Effect of Discharge Pressure on the Theoretical Discharge Temperature
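Table 1 can be reproduced with a short loop under the same assumptions (k = 1.3, suction at -0.5 psig and 60 deg F, 14.7 psia atmospheric pressure); this is a sketch, not the author's original worksheet:

```python
K, P_ATM = 1.3, 14.7                  # ratio of specific heats, psia
P_SUCTION, T_SUCTION_F = -0.5, 60.0   # suction conditions (psig, deg F)

for p_discharge in range(35, 80, 5):  # discharge pressure in 5-psi steps
    r = (p_discharge + P_ATM) / (P_SUCTION + P_ATM)
    td = (T_SUCTION_F + 460.0) * r ** ((K - 1.0) / K) - 460.0
    print(f"{p_discharge:3d} psig   r = {r:4.2f}   Td = {td:5.1f} deg F")
```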



Design Temperature Margin

Compared to a hypothetical design limit of 275 deg F, the theoretical discharge temperature begins to exceed the limit at a compressor discharge pressure of 50 psig. The relationship between the theoretical discharge temperature and the design limit temperature can be seen in Figure 1. It is a good idea to select a conservative design temperature limit during the selection phase of a project to ensure a safe operating margin that accounts for unknown or unexpected internal cylinder losses.

 

For example, suppose a potential compressor has a recommended discharge temperature alarm limit of 325 deg F and an automatic shutdown at 350 deg F. If the actual discharge pressure is 60 psig, expect a minimum Td of about 303 deg F. (Remember that the discharge temperature values in Table 1 are theoretical.) In reality, the temperature will be higher due to internal losses as the compressor experiences normal degradation. If the actual Td is closer to 318 deg F, the alarm margin will be only 7 deg F, which will lead to countless alarms and midnight phone calls.
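A quick margin check makes the problem concrete. The limit values come from the example above, and the 318 deg F reading is the hypothetical degraded case:

```python
ALARM_F, SHUTDOWN_F = 325.0, 350.0  # recommended limits from the example
td_actual = 318.0                   # hypothetical Td after normal degradation

print(f"Margin to alarm: {ALARM_F - td_actual:.0f} deg F")        # 7 deg F
print(f"Margin to shutdown: {SHUTDOWN_F - td_actual:.0f} deg F")  # 32 deg F
```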

 

To avoid this situation, specify a conservative design discharge temperature and add compression stages to keep the compression ratio per stage small. Table 1 shows that, for this example, the design compression ratio should not exceed 4.5 per stage to maintain a healthy margin between the operating temperature and the alarm limit.
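One common rule of thumb, which the article does not state explicitly, is to split the total compression ratio evenly across stages, so each stage sees roughly the nth root of the total ratio. A sketch under that assumption:

```python
def stage_ratio(r_total, n_stages):
    """Per-stage ratio when the total ratio is split evenly across stages."""
    return r_total ** (1.0 / n_stages)

# Splitting the worst case in Table 1 (r = 6.32) across two stages keeps
# each stage well under the 4.5-per-stage guideline.
print(round(stage_ratio(6.32, 2), 2))  # 2.51
```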

[Figure 1. Theoretical discharge temperature versus the design limit temperature]