
Recommended Posts

Posted (edited)

The entire notion of turbocharging, big power, boost and compression warrants a discussion. Recently, the following question was posted in a thread on another site...

"The CLA45 AMG runs 8.6:1 compression and 26 psi of boost. 26 psi of boost is 40.7 psi of absolute pressure. If you are squeezing 8.6 parts of that into 1, won't you have an effective compression of (14.7+26) / 14.7 x 8.6 = 23.8:1? How can any engine run on 23.8:1 of compression without blowing up?"

The responses range from claims that the effective compression is not 23.8:1, to the idea that you must somehow take the square root of 40.7 before multiplying it by 8.6, to other forms of alternative fuzzy math meant to justify the engine not blowing up. None of these are true or accurate. The truth is that, yes, you do in fact have 23.8:1 compression in terms of cylinder pressure prior to ignition. And yes, it's perfectly alright! Forget any notion you may have that a certain compression ratio will, by itself, detonate or pre-ignite. It doesn't work like that. If you need proof, just ask yourself this: if you have fuel and air at 3000 psi at 70 deg F, do you think they'll somehow burn? Heck no!
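
The arithmetic in the quoted question checks out on its own terms. A quick sanity check of the ratio math (this is just the question's own numbers, not a thermodynamic model):

```python
# Effective compression = (absolute intake pressure / atmospheric) * static CR,
# using the question's numbers: 8.6:1 static, 26 psi boost, 14.7 psi atmosphere.
ATM_PSI = 14.7

def effective_compression(static_cr, boost_psi, atm_psi=ATM_PSI):
    """Overall squeeze from ambient air to top dead center."""
    return (atm_psi + boost_psi) / atm_psi * static_cr

print(round(effective_compression(8.6, 26.0), 1))  # -> 23.8
```

With zero boost the function simply returns the static ratio, which is the naturally aspirated case.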

The ideal gas law (PV = nRT) states that for a given number of molecules (n) in an enclosed space, if volume (V) goes down, then pressure (P) goes up and temperature (T) increases correspondingly, because n and R are both constant in this case. At a certain temperature (T), the mixture of fuel and air goes bang without a spark lighting it, because it is hot enough to start burning. Typically, with modern engines on premium gas, you reach that critical temperature when you squeeze air and fuel from about 12~13 parts into one. That's why you never see street engines with compression ratios above 12.5:1 or so, and even 12:1 is pretty exotic. This is also why direct injection engines can run about 1~1.5 points more compression -- the atomization of fuel in the cylinders cools things down a little! Water injection? Same thing. The important thing about PV = nRT in terms of pre-ignition is that ONLY temperature (T) matters. The other numbers, and the way they change, are only relevant insofar as they affect temperature.
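
The "only temperature matters" point can be sketched numerically. Assuming an adiabatic squeeze of air, the end-of-compression temperature is T2 = T1 x CR^(gamma-1); the gamma = 1.4 value and the 293 K (70 F) intake temperature are illustrative assumptions of this sketch, not figures from the post:

```python
# Illustrative sketch: temperature at the end of an adiabatic compression,
# T2 = T1 * CR^(gamma - 1). gamma = 1.4 (air) and 293 K (~70 F) starting
# temperature are assumptions for illustration only.
GAMMA = 1.4

def end_of_compression_temp(t1_kelvin, compression_ratio, gamma=GAMMA):
    """Adiabatic end-of-compression temperature in kelvin."""
    return t1_kelvin * compression_ratio ** (gamma - 1.0)

for cr in (9.0, 12.0, 13.0):
    print(f"{cr:4.1f}:1 -> {end_of_compression_temp(293.0, cr):5.0f} K")
```

Temperature climbs steeply with compression ratio, which is consistent with the post's observation that the 12~13:1 region is where pump premium starts to autoignite.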

Now... the key here is that a turbocharged engine with a 23.8:1 effective compression does not do all of that compression in the cylinders. It only does 8.6:1 in the cylinders, but feeds them with air that has already been compressed by the turbo at a ratio of about 2.77:1. This is crucial to why it doesn't blow up!

First of all, suppose you start with air that is 2.77 times atmospheric pressure but still at room temperature, and apply PV = nRT. Pressure (P) starts higher, and the number of air molecules in the cylinder (n) starts proportionally higher. R is a constant. V and T can be the same as without boost! Think about it like this: you can have compressed air in your tires at 70 degrees F, and you can also grab the same volume of unpressurized air in your room at 70 degrees F. In other words, as long as the air is at the same temperature, you can run 12:1 compression on 26 psi of boost (40.7 psi absolute pressure) just as easily as you can run 12:1 compression on 14.7 psi of atmospheric air at sea level! Go to a planet with the same atmospheric composition as Earth and the same climate, but where the air is twice or three times as dense. Your 12:1 engine will run fine and make a lot of power!

The only reason that engine runs a reduced 8.6:1 compression is that the air force-fed from the turbo is NOT at room temperature when it gets to the cylinders. It is hotter. And because it is hotter, you can compress it less in the cylinders before it reaches the critical temperature at which it ignites. It is hotter because centrifugal compressors are not 100% efficient. They are at best 70~75% efficient, which means that, at best, only 70~75% of the work goes into extra density; the rest turns into heat. With an intercooler, you lower that temperature somewhat, but intercoolers are also only about 70% efficient at best. So you still end up with hotter air, just not as ridiculously hot as before. In theory, if your turbo compressor were 100% efficient, you wouldn't need an intercooler and you could run 100 psi of boost with the same compression as in your NA engine (say 12:1). In theory, if your intercooler were 100% efficient, you could also run infinite boost on the same NA compression until your cylinder walls crack, your turbos overspeed to death, or your rods give way under the increased combustion pressures and power! But because the turbo is about 68% thermally efficient and the intercooler is about 70% efficient, you end up with denser air that is also about 30% of 30%, or about 9%, hotter. Reducing the compression by 9% will make up for that. As it turns out, a little more reduction is needed, because the AMG turbo and IC are probably not even 70% efficient. The drop in compression ratio to 8.6:1 is only to compensate for the heating caused by the turbocharging system. It is not to compensate for increased density; you never need to compensate for increased density.
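
The heating chain described above can be sketched as: ideal adiabatic rise for the pressure ratio, inflated by compressor inefficiency, then partially removed by the intercooler. The ~70% figures are the post's illustrative numbers, and the simple "effectiveness" treatment of the intercooler is an assumption of this sketch:

```python
# Sketch of intake charge temperature after a non-ideal compressor and
# intercooler. comp_eff = adiabatic compressor efficiency; ic_eff =
# intercooler effectiveness (fraction of excess heat removed). The ~0.70
# values below are the post's illustrative figures, not measured data.
GAMMA = 1.4

def charge_temp(t_ambient, pressure_ratio, comp_eff, ic_eff, gamma=GAMMA):
    # Ideal (isentropic) outlet temperature for this pressure ratio.
    t_ideal = t_ambient * pressure_ratio ** ((gamma - 1.0) / gamma)
    # Real compressor: ideal temperature rise divided by efficiency.
    t_actual = t_ambient + (t_ideal - t_ambient) / comp_eff
    # Intercooler removes a fraction of the excess heat above ambient.
    return t_actual - ic_eff * (t_actual - t_ambient)

t = charge_temp(293.0, 2.77, comp_eff=0.70, ic_eff=0.70)
print(f"charge temp: {t:.0f} K vs 293 K ambient")
```

Even with both devices at 70%, the charge reaching the cylinders is still meaningfully hotter than ambient, which is exactly why static compression has to come down.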

The simple answer to the question above is hence simply that going down to 8.6:1 compression is sufficient to compensate for the heating from the turbo, and for the intercooler's failure to remove all of the heat created, such that the total heating before the spark fires is about the same as in the naturally aspirated 2.0 engine.

Edited by dwightlooi
Posted

Sorry, I meant to say that for a 2.77-times increase in pressure, there is about a 30/70 = 42% increase in temperature. The intercooler is only 70% efficient at removing this heat. Hence, about a 12.6% reduction in compression would make up for the temperature increase in a system that is pretty efficient -- 70% compressor efficiency plus 70% intercooler efficiency. Based on such idealistic assumptions, going from 11:1 to 9.6:1 compression would suffice. But I don't believe the CLA45's engine is operating at those efficiencies across the entire range of operating conditions, hence a greater reduction is necessary, and 8.6:1 sounds reasonable.
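
The back-of-envelope above works out as follows (this restates the poster's simplified arithmetic, not a rigorous thermodynamic derivation):

```python
# Poster's corrected arithmetic: ~42% charge-temperature rise from the
# compressor, of which the intercooler removes 70%, leaving ~12.6% excess
# heat; reduce static compression by roughly that fraction.
def reduced_compression(na_cr, temp_rise_frac, ic_eff):
    residual_heat = temp_rise_frac * (1.0 - ic_eff)  # fraction the IC misses
    return na_cr * (1.0 - residual_heat)

print(round(reduced_compression(11.0, 0.42, 0.70), 1))  # -> 9.6
```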

Posted

This is very interesting information and helps me better understand the turbo system.

Question of my own. We know that intercoolers and turbos are not 100% efficient. If we could chill the intake system going into the engine -- so that as the air goes through the turbo and intercooler and heads toward the cylinders, the passageway itself takes heat away -- would this not help us achieve closer to 100% efficiency?

Posted (edited)

Actually, it will. One of the reasons traditional intercoolers can never be anywhere near 100% efficient is that they rely on ambient airflow, either to cool the charge directly via a heat exchanger, or to cool water which then cools the intake charge via a second heat exchanger. The problem with that is that you can never get down to ambient temperature. It's like using a fan on a frying pan on the stove: you will cool the pan down a bit, but you can never get it to room temperature. As you get closer and closer to room temperature, the rate of heat transfer plummets until it is essentially zero.

The only way to cool the air charge back to room temperature, or even below it, is to use a coolant that is below room temperature. Drag racers pack ice slush around their ICs to do that, but it only works for a minute or two of running. Theoretically you could also put a refrigerator on your IC. The problem with refrigerators is that they move heat slowly. Notice that it takes an air conditioner 10~15 minutes to cool a cabin or room down in summer. It doesn't move a lot of heat per minute; it just keeps at it and gradually cools things down. That doesn't work for air that needs to be cooled in a second before it reaches the cylinders, unless you have an AC unit the size of your car -- which would be too heavy, and running it would suck up more power than the extra you produce. Ice slush and fridges aside, you also have rally cars -- and street rally-car wannabes like the Lancer Evo -- featuring a water mister for the IC. This adds evaporative cooling to the basic air-to-air heat exchange. It doesn't quite take the charge to room temperature, but it helps with intercooler efficiency, especially after a hard run has heat-soaked it.

The other thing you can do, of course, is not run that much boost! Firstly, turbos tend to be more efficient at about 1.8 bar absolute (0.8 bar / 11.6 psi of boost) than at 2.8 bar (1.8 bar / 26.1 psi of boost). Consider the compressor map below. This is the map of a Honeywell-Garrett GTX2867R turbo -- possibly the most advanced and most efficient available, with a peak efficiency of 79%. Airflow scales more or less linearly with rpm, barring major volumetric efficiency changes, so you can roughly draw a horizontal line across the map at a given pressure ratio to follow the turbo's efficiency as the engine climbs in revs. Notice that the efficiency contours are broader and the numbers better if you draw that line at 1.8 than at, say, 2.8...

GTX2867R_816366-1_comp.jpg

Secondly, even if efficiencies are identical, raising pressure by 13 psi heats the air half as much as raising it by 26 psi. Remember, heating is proportional to both pressure rise and efficiency. I have always favored low-boost designs for their low lag, greater linearity with the throttle, higher efficiency and higher static compression (which helps all of the above, plus fuel economy). Don't think that low boost equals low power; it just means a lower torque peak. You can make very serious power if your turbo can support high airflow. A 13 psi system with about 10.8:1 compression will respond almost like a high-compression NA engine during cruise and gentle driving, but will make about 110 lb-ft per liter. In other words, a 2.0T running that combination will make about 220 lb-ft. Not bad, and in fact preferable in FWD cars to something in the 260~280 lb-ft range, if only because it is more controllable when the front wheels both put power to the ground and steer at the same time. If you maintain 220 lb-ft to 6000 rpm, that's 251 hp. If you keep it there to 6600 rpm it's 277 hp, and if you stretch it to 7200 rpm it's 302 hp. Not bad at all. Plus the engine will drive like a high-revving, 3.0-liter-class six making 100 hp/liter.
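
Those horsepower figures come straight from the standard hp = torque (lb-ft) x rpm / 5252 relation:

```python
# hp = torque (lb-ft) * rpm / 5252 -- the standard conversion behind the
# 251/277/302 hp figures above (small differences come from rounding).
def horsepower(torque_lbft, rpm):
    return torque_lbft * rpm / 5252.0

for rpm in (6000, 6600, 7200):
    print(f"220 lb-ft at {rpm} rpm -> {horsepower(220.0, rpm):.0f} hp")
```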

Edited by dwightlooi
Posted

That is great info and very easy to understand. Thank you for taking the time to explain this for us.

Next big question: how does this compare to a supercharger? We have the intercooler and the supercharger sitting right on top of the engine -- I realize there are those that attach to the front of the engine -- but my understanding is that the Whipple is still the most efficient supercharger around.

So comparing a Whipple supercharger with intercooler to a Garrett GTX2867R turbo with intercooler.

How do they stack up?

Posted (edited)

Well, that's an entirely different topic, but to put things concisely...

(1) Superchargers can be very efficient in terms of adiabatic efficiency (thermal)

(2) Different types of superchargers have different efficiency and operating characteristics (more on that later)

(3) The biggest difference between a turbocharger and a supercharger is that it takes power to drive a supercharger; it doesn't take power to drive a turbocharger

Screw-type superchargers can be up to 80% thermally efficient. Roots types are about 70%. Centrifugals are about 75%. Remember, those are peak numbers. For the purpose of determining static compression ratios, you should look at the minimum efficiency within the operating range rather than the peak, and the minimum is about 15~20% lower (the same goes for turbos).

Screw and centrifugal type chargers are efficient because they offer some degree of internal compression. That is to say, air exiting the supercharger is at a higher pressure than air entering it. Screws do this with lobes that grow and hollows that shrink as you progress along the screw from intake to exit. Centrifugals are basically a turbocharger's compressor driven by a belt (usually with step-up gearing), so they compress air as well as move it, dynamically throwing the air outward with an impeller at high speed. Roots blowers are "external compression" chargers. In other words, all they do is move air; they do not compress it at all. Air leaving a Roots blower is at the same pressure as atmospheric air. But because air is being pushed into a plenum, and the Roots blower moves more air into it than the engine consumes, pressure builds up in the intake plenum.

Because of the way they work, screws are overall not only the most efficient but the most consistently efficient over a wide speed and boost range; they are also the heaviest and most expensive (because of the fancy machining needed for the progressive screws). Centrifugals are the cheapest, as they are half a turbo with a belt. They are quite efficient at their peak but have two major flaws. Firstly, boost is not linear: the faster you spin one, the more boost it makes. Hence, engines with centrifugals tend to be very peaky, since they run boost that increases with rpm! Secondly, while centrifugals are very efficient at speed, they are extremely inefficient at low rpm. This is why essentially no OEM uses centrifugals. Roots blowers offer a good balance between cost and consistent boost output. The thing to remember about Roots is that, because they are external compression pumps, they are VERY EFFICIENT at low boost, when the difference between the outlet air pressure and the plenum pressure is small, but very inefficient at high boost, when the plenum is highly pressurized and atmospheric-pressure air is being pushed into it by the blower. In fact, at high boost, air actually backflows significantly into the supercharger from the plenum before being pushed back out. One good thing about Roots is that air goes in at the front and comes out the top (or bottom); this allows for tighter packaging than screws!

To put things into perspective, an LS9 engine like the one in the ZR1 uses up about 65~70 hp just to drive its Roots blower. This means you are actually burning enough fuel, and putting enough strain on the engine internals, to make about 700~710 hp, but you are only getting 638 out of the engine; the rest is consumed by the supercharger itself. Hence, it is not difficult to understand why supercharged solutions are generally less fuel efficient, and why most have a bypass valve or a clutch to let the blower freewheel or decouple completely at cruise and low throttle.
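
The bookkeeping in that paragraph is simply crank output plus blower drive power (the 638 hp and 65~70 hp figures are the post's estimates):

```python
# Indicated (combustion-side) output for a blower-driven engine: the
# engine must make crank power plus the power spent spinning the blower.
# 638 hp crank and 65~70 hp blower drive are the post's LS9 estimates.
def indicated_power(crank_hp, blower_drive_hp):
    return crank_hp + blower_drive_hp

print(indicated_power(638, 65), indicated_power(638, 70))  # -> 703 708
```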

Edited by dwightlooi
Posted (edited)

Anyway, the way superchargers are playing out, Roots blowers are winning in the marketplace...

Most force-induction solutions are turbocharged. Of those that are supercharged, essentially all are Roots based. Audi's 3.0 TFSI (despite the "T" in the nomenclature) uses an Eaton TVS R1050 twisted-rotor Roots blower. Jaguar's 3.0 and 5.0 supercharged engines in the XF, XJ and F-Type use Eaton TVS Roots blowers -- the TVS R1050 and TVS R1650 respectively. GM's LSA and LS9 6.2 V8s in the CTS-V, Camaro ZL1 and Vette ZR1 use Eaton TVS Roots blowers -- the TVS R1900 and TVS R2300. Mercedes' 5.5-liter M113 V8s in the mid-2000s AMG cars were Lysholm screws, but those are all out of production.

I think the reasons are simple: they are cheap, they are reliable, and they are linear. If you run 7~8 psi of boost, they are nearly as good as Lysholm screws. At 9 to 10 psi they start to show slight inefficiencies, but not much, and at 12 psi or above screws shine. But most of these engines are NOT high-boost designs. The LSA at 9 psi and the LS9 at 10.5 psi are actually the lowest-boost users; the DOHC engines run a psi or so more. Audi's 3.0 TFSI runs at 11.5 psi.

Edited by dwightlooi
  • 2 months later...
Posted

Update... as far as Roots blowers are concerned, the new state of the art is the 2nd-generation Eaton TVS. The first of these, the TVS R1740, is being used in the LT4 engine powering the new Corvette Z06. The 2nd-generation TVS blowers operate at up to 20,000 rpm (33% faster than the 1st generation). The operating speed may have improved, but, being external compression air pumps, Roots blowers are still most efficient in low-boost applications. Typically, they are most efficient at about 0.6 bar of boost (8.7 psi); they are OK at 0.9 bar (13.1 psi); beyond that they get very inefficient very quickly. This is because air exiting the blower is still at atmospheric pressure; pressure builds only when air is packed into the plenum faster than the engine consumes it. When the plenum is at high boost pressure, air actually gets pushed back into the blower when the lobes open to the plenum, before being forced back out again -- this is highly inefficient, and it gets worse with increasing plenum pressure.

Below are the compressor maps of two Eaton TVS blowers: the R1900 (used in the CTS-V's LSA engine) and the R900 (used in Audi and Jaguar sixes). If you draw a horizontal line at 1.6 bar (corresponding to 0.6 bar of boost) you'll see that they are very efficient indeed. At 1.9 bar they start to miss the peak-efficiency "island". At 2.5 bar they suck.

ct_127899.gif

ct_127897.gif
