Cold start engine wear?

Status
Not open for further replies.
There have been many studies showing increased wear when the oil is cool or cold. There are a variety of potential causes: fuel and water condensing on cylinder walls, increased acidic corrosive wear, anti-wear additives being less active, ill-fitting parts due to dissimilar expansion rates, less-than-full-film thickness on cylinder walls, etc. I'm talking about wear during the warmup phase, not just the very brief start-up event. Here is one study that focused on valve-train wear: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.924.5777&rep=rep1&type=pdf
 
Seems they are building cars with stop/start to the point it's as annoying as can be … (so far just several rental cars for me).

Our Fusion Hybrid will run on EV power for the first few blocks out of the driveway - and then you catch a 50 mph loop when the 2.0L cold-starts and joins in at 0-50.
I bumped it up to 5W-30 at 100k and change the oil at around 8k miles/nine months. That motor stops/starts no matter where you drive.
 
Originally Posted by JAG
There have been many studies showing increased wear when the oil is cool or cold ... Here is one study that focused on valve-train wear: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.924.5777&rep=rep1&type=pdf

Perfect example of misunderstanding of study results. (no reflection upon you personally, JAG).
This study linked above is essentially a HALT test.
The general protocol is that they ran an engine under prescribed conditions and then measured the wear on the valve-train. OK - fine by me so far.
But it's the way (the method) they did it, the way they had to do it, to induce the result they sought. Read on ...
They first ran the engine at 1500 rpm for 40 hours with the oil held at 40 deg C, then they ran the engine at 3000 rpm for 60 hours with the oil at 100 deg C. The test is a continuous run for 100 total hours. (Note: coolant was also held near the oil temps, so the engine oil and coolant were within 5 deg of each other.)

Now, I don't know about you and your car/truck/tractor/generator/whatever, but my engine does NOT run at "cold" oil temps for 40 straight hours. Think about it ... The reason the oil wear analysis showed up to 5x more wear is because THEY RAN THE ENGINE FOR 40 HOURS WITH THE OIL TEMP AT 40C (104F) and then contrasted that wear rate to running the engine at 100C (210F) for 60 hours. (They also reversed the test; running 60 hours hot oil and then 40 hours cold oil).

Typically, any decent water-cooled engine has a thermostat, which allows the engine to come up to temp fairly quickly, even in winter. That warms the oil up to a point where it's doing its job the best it can. I WANT my oil around 200F - 225F; it's supposed to be there. The reason this linked test is not really valid is because it does not, in any manner, represent the reality of what happens in our vehicles.

HALTs are great for revealing the things they are designed to reveal. Be it this "cold oil" test, the infamous GM filtration study, etc. All these "tests" are typically designed to reveal a disparity between two or more options of some conditional variable. That's fine. But you MUST stop and ask yourself: do the conditions of the test actually represent the reality of your world? If they don't, then the results of that test will not manifest into reality in your garage.

If all you ever did with your vehicle was start it up, move it from the garage to the driveway in the morning and shut it down after 30 seconds, and then reverse this process at night to put the car away, then the claim of 5x more wear from "cold" oil might be valid.
But most of us don't do that every single day. In fact, we rarely do that. Typically, even if the cabin does not get warm, the engine will get warm in a few minutes of normal driving. We not only start our engines, but we drive them too! And once it's under load, even moderate driving warms the engine coolant/oil fairly quickly. Additionally, the thinner lubes tend to come up to their operating vis sooner; hence a lower wear rate anticipated sooner.
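To put the study's protocol in perspective, here's a quick arithmetic sketch of how long a commuter would take to accumulate the study's 40 continuous hours of cold-oil running. The warmup time and trips per day are assumed round numbers for illustration, not measurements:

```python
# Sketch: how long would it take a commuter to accumulate the study's
# 40 hours of cold-oil running? (Warmup time is an assumed figure.)

COLD_HOURS_IN_STUDY = 40          # oil held at 40 C for 40 continuous hours
warmup_min_per_trip = 10          # assumption: oil near temp within ~10 min
trips_per_day = 2                 # e.g. commute in, commute home

cold_min_per_day = warmup_min_per_trip * trips_per_day
days_needed = COLD_HOURS_IN_STUDY * 60 / cold_min_per_day
print(f"{days_needed:.0f} days of commuting to match the study's cold-oil time")
# prints: 120 days of commuting to match the study's cold-oil time
```

In other words, the test compresses roughly four months' worth of a commuter's scattered warmup minutes into one uninterrupted cold soak, which is exactly the kind of acceleration a HALT is designed to use.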

IMO, one of the worst things you can do is start your very cold engine and let it "warm up" at idle in winter, because it takes longer for the coolant and oil to come up to temp. Just let her idle for as long as it takes for the "flare" idle to drop (that's what warms the cats up), then put her in "D" and go about your business in a moderate manner.

I don't disagree that cold oil and cold engines experience a bit more wear, but it has to be kept in perspective. That wear is typically unavoidable, and focusing on it completely ignores the TCB (tribochemical barrier) effect. As oil matures, it oxidizes, and as that oxidation is laid down, it coats the surfaces (SAE study 2007-01-4133 from Ford and Conoco). The HALT study which JAG linked does not address the topic of TCB, and if they were constantly introducing fresh oil, they were also stripping away the TCB each and every time, and bare unprotected metal is more susceptible to "wear" than metal coated by the TCB.



Again - my data shows that this is a completely overblown (and often misunderstood) topic. Real world wear-rate data does not lie. I don't care about what happens in the lab. I care about what happens in our collective driveways, and over 15,000 UOAs cannot lie; start-up wear is a moot point because the conditions that contribute to that kind of wear are very low in total quantity of operational hours.
 
Originally Posted by dnewton3
The whole topic of high wear at start up is total bunk in today's engines.
Macro data shows it's an overblown topic. There's zero proof that it's an issue, and actually hasn't been for many, many years.
There are short-trip commuters that see lots of cold start cycles and they might have 5x the starts that my car sees (I typically only start it 2x a day; once in to work and once home - 30 miles each way).
Are their wear rates 5x mine? No way in Hades.

We had a 1995 Villager back in the day; got it new and ran it to 245k miles. I ran UOA experiments on it trying to understand how OCI duration and use affect wear. It was the wife's van, and when the kids were young it was the quintessential soccer mom ride. Short trips, lots of cold starts, etc. It should have been the perfect example of high wear, if you believe the oil companies and the owner's manual (3k miles for "severe" service was the recommendation). I wanted to experiment, though ... I started moving out to 5k mile OCIs; no resulting wear issues. Then 10k miles; still no wear issues. Ran some 15k mile OCIs; again no issues. The wear rates were coming down as the OCIs went up. And ... we would take it out to AZ from IN every other Thanksgiving to see the in-laws. Long hauls of 12 hours at steady state; often a 5k mile round trip with all the various side trips of sightseeing. Guess what ... ???
The wear rates never budged; they were what they were. Didn't matter that I ran longer OCIs. Didn't matter that we changed operational patterns. The wear rates were fairly steady. There is always variation involved; things are not totally stagnant. But the amount of "normal" variation FAR exceeds the variation from the types of service factors typically induced in use. Life happens, and the engines generally don't care. They shed what they shed, and you're not typically doing much to alter it.

In all the macro data I have, there is zero proof that starting an engine repeatedly makes for a lot of wear. That might have been true 50 years ago, when oils were not great and engine manufacturing was not great in terms of tolerance controls and machined surfaces. However, engines made in the last few decades, along with the improvements in lubrication, make this a non-issue; it's a topic based in the same age-old rhetoric as the 3k mile OCI.

If someone has proof (real-world data that is not just hearsay) I'll entertain the conversation. But all the facts I have seen show it's a moot point. After all, I have over 15,000 UOAs from all manner of engines and applications; cars, trucks, tractors, generators, etc. Use in FL, AK, AL, CA, MN, ME, TX, AZ, etc. You name it, I've probably got a UOA that represents it well. And the data tells me that multiple starts do NOT induce a shift in wear rates that is discernible in the grand scheme of use. There are a few exceptions I have seen, but even then the shift in wear is not enough to make a mountain out of a molehill. I've seen wear rates go up as much as 50%, but you have to keep that in perspective ... when an engine averages 2ppm/1k miles and the use pattern shifts it to 3ppm/1k miles, it's not like that engine is suddenly going to blow up, or fail in the next decade. Wear rates today are so low on most engines that even major shifts in wear (again, these are rare) still do not result in wear rates high enough to be of any concern whatsoever. Your car will either rot or be destroyed in a wreck far sooner than the engine will die from wear caused by start-up.
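To keep that 2 to 3 ppm/1k-mile shift in perspective, here's a quick sketch of what it means in absolute iron. The sump size, oil density, and OCI length below are assumed round numbers, not data from any particular engine:

```python
# Sketch: what a 2 -> 3 ppm/1k-mile shift in Fe means in absolute terms.
# Sump size, oil density, and OCI length are assumptions for illustration.

sump_liters = 4.7                  # assumed ~5 US quart sump
oil_density_kg_per_l = 0.85        # rough density of engine oil
oil_mass_g = sump_liters * oil_density_kg_per_l * 1000   # ~3995 g of oil

def iron_mg(ppm_per_1k_miles, oci_miles):
    """Total iron shed into the sump over one OCI, in milligrams."""
    total_ppm = ppm_per_1k_miles * oci_miles / 1000   # ppm accumulated by drain time
    return oil_mass_g * total_ppm / 1e6 * 1000        # that fraction of oil mass, in mg

oci = 5000
print(f"2 ppm/1k: {iron_mg(2, oci):.0f} mg of Fe over a {oci}-mile OCI")
print(f"3 ppm/1k: {iron_mg(3, oci):.0f} mg of Fe over a {oci}-mile OCI")
```

Under those assumptions, a 50% relative jump works out to roughly 20 mg of additional iron over a whole oil change interval, which is why even a "major" shift in wear rate does not translate into an engine-killing amount of metal.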
 
Originally Posted by dnewton3
The whole topic of high wear at start up is total bunk in today's engines. Macro data shows it's an overblown topic ...



So what I am learning on BITOG is that as long as you use a good quality oil that meets specs and a good filter, there is not a whole lot of fuss with these modern engines. Going to have to find something else to be obsessed about!
 
Quote
Perfect example of misunderstanding of study results.

I don't see how that is the case when I examine my own thoughts and certainly don't see what I said that would indicate a lack of understanding. Anyhow, that's not worth further discussion since my level of understanding of the tests is a pointless distraction.

I used to have a car with an oil temperature gauge. It would typically take 15 to 20 minutes for the oil temperature to stabilize. Many other posters here with oil temp. gauges have reported similar findings.

Quote
I don't disagree that cold oil and cold engines experience a bit more wear, but it has to be kept in perspective.

I'm glad you agree that cold oil and engines experience more wear. That was my main point. How much it matters is important and is the challenging question to answer because there are so many factors at play.

Quote
I don't care about what happens in the lab.

I think that attitude is detrimental to seeking facts. Anything that aids in finding the truth should be welcomed. Lab tests allow factors to be relatively tightly controlled, and the measurements can be made by many relatively precise methods. One of their weaknesses tends to be a relatively small number of tests, which limits how many times the various factors can be varied. Real-world testing also has its pros and cons. Relying on both lab tests and real-world tests is better than relying on only one of them.

I prefer to rely on data, but I will deviate from that and just give my thoughts on a "what if". Suppose a particular new car went 100k miles consisting only of 5-mile trips, with complete cool-downs between runs. Now suppose time could be reversed to have that exact car back as new, and it was then driven almost in one continuous run of 500k miles, stopping only for refueling, oil changes, and air filter and spark plug changes. Assume that, aside from that engine maintenance and refueling, the rest of the car magically never needed anything replaced to stay on the road. I would not be surprised if the engine that went 100k in 5-mile trips was approximately as worn out as the one that went 500k miles with minimal stoppages. But such things are guesses, and the only benefit of making them is entertainment value.
 
I ran that perfect oil change interval last summer with Duron SAE 40. 9,500 miles in just over 200 hours. UOA showed 6 ppm of Fe. I believe that the data is moot, as most of the cylinder wear metal goes out the exhaust pipe and not into the oil.
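For context, the figures in that UOA can be normalized the way wear rates are usually compared in this thread. This is just a quick sketch using only the numbers quoted above:

```python
# Sketch: normalize the UOA above (6 ppm Fe over 9,500 miles / ~200 hours)
# into the per-1k-mile and per-hour rates discussed elsewhere in the thread.

fe_ppm = 6
miles = 9_500
hours = 200

per_1k_miles = fe_ppm / (miles / 1000)   # ppm of Fe per 1,000 miles
per_hour = fe_ppm / hours                # ppm of Fe per engine hour
print(f"{per_1k_miles:.2f} ppm/1k miles, {per_hour:.3f} ppm/hour")
```

That works out to well under 1 ppm of iron per 1k miles, far below the 2-3 ppm/1k figures mentioned earlier in the thread, with the caveat the poster himself raises: much of the cylinder wear metal leaves via the exhaust rather than staying in the oil.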
 
JAG - first of all, I want to make sure I clearly state that I do not personally attribute the misunderstanding to you; it was a general comment on how people at large will skim over a study and not understand it, or realize the implications of how/why it was done and consider if it's truly applicable. That is what I meant when I said "no reflection on you personally". Sorry if I offended you; wasn't my intent.

I work with HALTs all the time. They have merit in that they show a relative disparity of performance between two or more competing characteristics. But that should NEVER be seen as easily and totally transferable to the real world. HALTs create conditions purposely to highlight and reveal these disparities. However, it is rare that those conditions actually are represented in the real world.

In the study you linked, it's clear that those wear conditions based on oil temps in no way represent how most people operate their engines. The one exception I can think of would be a small-HP water-cooled marine outboard engine, where the intake cooling water is always cold because the lake/sea is used as an open cooling loop; it never heats up. But I cannot comment on the oil temps; I don't know how those are controlled in that type of engine. However, larger marine engines have closed-loop systems which essentially operate not unlike a land-based water-cooled ICE. The VAST majority of conversations we have here on BITOG revolve around land-based powered equipment. And so testing wear for 40 straight hours at 104F is nowhere near what my reality is, and that revelation of 5x more wear is also not applicable. Frankly, that study you linked is completely worthless to me, and to just about everyone here. Sure - it's a factually driven study that shows results; I would not disagree. But where we likely separate our interests is that I see a HALT for not only what it is, but more importantly what it isn't. That oil-temp wear study is completely and utterly useless to me, because even the typical "soccer mom" vehicle gets up to temp even during short drives; quicker than you think. And sure as the sun rises each morning, there is not one person here who operates their engine for 40 straight hours at 40C.

This is not unlike my objections to the GM filter study that so many people and companies point towards regarding the "need" for the best filtration. If you've actually read that study, and reasoned through its methodology, it too has no bearing on my life. In that study, they ran an engine with various pore-size filters to "prove" that finer filtration is better. They ran the engine for 8 straight hours, dumped in 50 grams of fine dust each hour, only changed the filters upon a dP of 20 psi, and never changed the oil at all. Now, an unknowing person may not question that method, but I'm an informed person, and I realize that 50 grams of dust every hour equates to the ingestion equivalent of 570k miles of typical driving over those 8 hours. Think about that for a moment ... They ran FIVE-HUNDRED-SEVENTY-THOUSAND MILES of dirt into the engine, and yet never changed the oil and only changed filters when the dP got to 20 psi.

They started with filters that were 40um nominal, and worked their way down in these "tests". That 40um filter was their baseline starting point for wear. They then claimed that the improvement in filtration could be 8x "better" by using a 7um filter. Now, I cannot attest to what you do, JAG, but I don't run my engine for 570k miles on only 1 OCI with a 40um filter. Essentially they set an unnaturally high baseline, and then used an absurd OCI scheme to induce the wear to a point where they could actually discern differences in performance. And here's the kicker ... in the summary near the end of the report, they clearly admitted this is not "normal" and real-world use would never see this kind of performance disparity. They knew what they did would never reveal such a "better" filter choice in your garage, because the HALT was so over-the-top in its design.
They flat out stated that we'll never see that kind of wear reduction using "better" filters, but that does not stop people from singing the praises of that study. The reality is that GM purposely chose to control only one variable (filter efficiency), and so they dutifully ignored other aspects of wear control like oil changes. To be "normal", they would have had to change the oil in that study every 5 minutes; a complete system drain/fill would have not only taken way too much time and cost, but also completely masked what they intended to prove.
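As a rough sanity check on those numbers: the per-mile dust figure below is back-calculated from the 50 g/hour and 570k-mile figures quoted in the post, not taken from the study itself:

```python
# Sanity-check sketch of the dust-loading acceleration described above.
# The per-mile ingestion figure is back-calculated, not from the study.

dust_g_per_hour = 50
test_hours = 8
total_dust_g = dust_g_per_hour * test_hours      # 400 g over the whole test

claimed_equiv_miles = 570_000                    # miles of "typical" driving
implied_mg_per_mile = total_dust_g * 1000 / claimed_equiv_miles
print(f"{total_dust_g} g of dust ~= {implied_mg_per_mile:.2f} mg per mile of normal driving")
```

So the 8-hour test crammed in roughly 400 grams of dust, which at well under a milligram per mile of implied normal ingestion represents hundreds of thousands of miles of driving, with no oil change in between.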

If you REALLY want to know how much wear is attributed to start-up and low oil temps, the correct way to do a study would be a DOE with typical-use equipment and products, comparing/contrasting the wear trends relative to the true operational conditions that normal life presents. You know ... like my UOA data study using 15,000 samples ...

Lab tests can help illuminate how/why things happen, but that does NOT, in any way, shape or form, mean that the lab conditions actually represent what happens in real life at all times, and when it's a HALT test, it's almost an assurance that the lab results will not manifest into reality in your garage.
 
Originally Posted by JAG
Thank you Mr. Newton. That is all very rational.


+1,000
 
Originally Posted by dnewton3
JAG - first of all, I want to make sure I clearly state that I do not personally attribute the misunderstanding to you ...

Lab tests can help illuminate how/why things happen, but that does NOT, in any way, shape or form, mean that the lab conditions actually represent what happens in real life at all times ...





If you were to take an informed guess at the wear difference, what would you guess? I used to drive 45 minutes each way on the highway Monday to Friday, and that was the majority of my driving. Now I drive 6-7 minutes to work in the city, stop and go. I have no idea of the oil temperature, but my coolant gets up to 180 and the thermostat opens at the last light before I arrive at work. My oil definitely gets dirty-looking a lot faster than it used to, and I change it by 3k miles or 6 months roughly. It's a carbureted 1976 Olds 350 V8.

The carbureted 305 SBC in my winter car has it even worse: -25C cold starts sometimes in the winter and the same short drive.
 
Just off the cuff, I would go with whatever 5W-40 was the cheapest at the time and keep the short interval. Easier to start but a robust weight for an old engine.
 
Originally Posted by caprice_2nv
If you were to take an informed guess at the wear difference, what would you guess? ...


The old SBC engines have very high wear rates. It would be difficult to "guess" what your rate is relative to your current driving conditions. Engines that old are not included in my general comments; I speak of the engines from the last three decades, which reflect modern machining methods, fuel control, etc. By the time the early 1990s rolled around, there were massive improvements in engine design, manufacturing controls and lubricants. Your 43-year-old engine is not in that class.

If you got some UOAs, you'd know, instead of having to "guess". I have the macro data for the old SBC engines. However I'm not sure if your Olds 350 is the same design as the SBC from that era.
 
Originally Posted by dnewton3
Originally Posted by caprice_2nv
If you were to take an informed guess at the wear difference, what would you guess? ...


The old SBC engines have very high wear rates. It would be difficult to "guess" what your rate is relative to your current driving conditions ... If you got some UOAs, you'd know, instead of having to "guess".


The Olds engine is a completely different design than the SBC. My SBC has 235k miles on it and is just starting to get a little worn. It leaks a lot and has some cold-start piston slap in the colder weather, but only uses about a quart in 3,000 miles.

The Olds uses about the same amount of oil but has no bottom-end noise or leaks. They're known for going over 300k miles easily with nothing but a timing chain replacement. I don't know about the wear rates (never seen a UOA on one), but I assume the wear rates in a UOA don't translate to real-life mileage expectancy either (based on the number of small block Chevys that have gone a ton of miles).
 