I'll partially agree with Gary on this one.
Startup wear is caused by:

* the oil being below the vaporization and sublimation points of the various combustion byproducts,
* buildup of unburnt hydrocarbons, which accumulate while the oil is cold and are not boiled off,
* acidic attack on the engine metal from oxidation of the oil and from hydrobromic acids that aren't boiled off until the engine warms above 140°F, AND
* insufficient lube at startup.

Of that, approximately half is due to insufficient lube at startup.
A cold start can cause the equivalent of 200 to 1,000 miles of wear (normalized against the wear rate of a warm engine running clean, good oil at constant highway speed on flat highway). The actual wear factor depends not only on how many times you started the car, but on how the temperature ramped up and how long the total trip was (was it long enough to boil off combustion byproducts, etc.). If you don't idle much and the engine warms fairly quickly, a start can cost as little as 100 miles of wear; but if the initial temperature was low, you idled to warm up, the trip was very short, and your oil is below average, wear can reach 1,000 miles per startup. Subtract the metal dissolution from the non-load mechanisms and you're still left with 100 to 500 miles of wear from insufficient lube at startup.
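Just to make that arithmetic concrete, here's a rough back-of-envelope sketch in Python. The baseline and the multipliers are my own illustrative assumptions (not measured data); they're only tuned to reproduce the 100 to 1,000 mile range and the "roughly half from insufficient lube" split described above.

```python
# Illustrative-only estimate of wear-equivalent miles per cold start.
# All factors below are assumed for the sake of example; they are NOT measured values.

def wear_miles_per_start(ambient_temp_f, idle_to_warm, trip_boils_off_byproducts, oil_quality):
    """Crude wear estimate in 'equivalent warm-highway miles' for one cold start."""
    miles = 200.0                       # assumed baseline for a typical cold start
    if ambient_temp_f < 32:
        miles *= 2.0                    # colder start: thicker oil, longer poorly-lubed period
    if idle_to_warm:
        miles *= 1.5                    # idling to warm up prolongs the cold-running period
    if not trip_boils_off_byproducts:
        miles *= 1.5                    # short trip: acids and hydrocarbons never boil off
    if oil_quality == "below_average":
        miles *= 1.3
    elif oil_quality == "good":
        miles *= 0.5
    return min(miles, 1000.0)           # cap at the upper bound quoted above

# Roughly half of the total is attributed to insufficient lube at startup.
total = wear_miles_per_start(20, True, False, "below_average")
print(f"total ~{total:.0f} mi, of which ~{0.5 * total:.0f} mi from insufficient lube at startup")
```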
So with about half the wear coming from insufficient lube at startup, you obviously want to select an oil viscosity that decreases the time it takes for the engine to be lubricated "sufficiently".
Now here's the million-dollar question: how much can you influence how quickly "the proper" lubrication is established by changing the oil's viscosity? For example, if at cold start one oil has a viscosity of 700 cSt and another is 3,500 cSt, how much faster will "the proper" lubrication be established with the 700 cSt oil? Now factor in different engine designs, and it'll probably be even more difficult to answer the question with pinpoint accuracy.
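As a first-order sanity check only: if you assume delivery to the bearings is pressure-limited laminar flow through the oil galleries (Hagen-Poiseuille), flow scales inversely with viscosity, so the time to push a fixed volume of oil out to the bearings scales roughly linearly with viscosity. That ignores the positive-displacement oil pump, the relief valve, and the residual oil film from the last run, so treat it as an upper bound on the difference, not a prediction.

```python
# Hagen-Poiseuille assumption: flow Q is proportional to dP / viscosity, so the time to
# deliver a fixed oil volume scales linearly with viscosity. This neglects the
# positive-displacement pump, relief-valve bypass, and residual oil films, so it is
# an upper bound on the real-world difference, not a prediction.

def relative_time_to_lube(viscosity_cst, reference_cst=700.0):
    """Time to deliver a fixed oil volume, relative to the reference oil."""
    return viscosity_cst / reference_cst

for v in (700.0, 3500.0):
    print(f"{v:6.0f} cSt -> {relative_time_to_lube(v):.1f}x the time of the 700 cSt oil")
```

Under that (generous) assumption the 3,500 cSt oil takes about 5x as long to get oil where it needs to be; in a real engine the pump and the leftover oil film will shrink that gap considerably.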
By the way, here's a picture of Gary in real life. [Big Grin]