That paper was based on bench tests, as are the other SAE tests along these lines. It's been a long time since I've read the paper, but as I recall, the results were not derived directly from the engines the oils were run in. The oils were collected from the subject cars and then run in test engines. In order to get this info, the cams were cleaned of any residual anti-wear layers. Valid data could only be had by using isotope-doped cams, not ppm from a UOA.
There is data in the literature to explain this: the glass-like anti-wear layers are produced by the daughter products of ZDDP as it breaks down. Used oil contains a higher percentage of these daughter products and can more quickly put down a hard, effective anti-wear layer on bare metal. It doesn't apply to you and me, as we don't clean off the anti-wear layer at each OCI. It's an important piece of the overall lubrication picture, but it isn't an end unto itself.
If I'm confusing this study with others I've read, please feel free to correct me.
Quote:
Ya right. Point me to such UOA that has that 100ppm of Fe.
Take an engine that sheds 1.0 ppm Fe/1,000 miles and run it to 10,000 miles. The oil now contains 10 ppm Fe. Do an oil change, which typically leaves at least 10% of the old oil behind, so the fresh fill starts out at 1.0 ppm Fe. Sample every 1,000 miles.
This will be the results:
1K = 2.00 ppm/1K
2K = 1.50 ppm/1K
3K = 1.33 ppm/1K
4K = 1.25 ppm/1K
5K = 1.20 ppm/1K
6K = 1.17 ppm/1K
7K = 1.14 ppm/1K
8K = 1.12 ppm/1K
9K = 1.11 ppm/1K
10K = 1.10 ppm/1K
It would appear that new oil causes more wear, when in reality the wear rate was a constant 1.0 ppm/1K from the first mile driven.
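The arithmetic above is easy to check for yourself. Here's a minimal Python sketch of the same hypothetical engine (constant 1.0 ppm Fe shed per 1,000 miles, 10% carry-over of oil that had reached 10 ppm):

```python
# Hypothetical engine from the example above: sheds a constant
# 1.0 ppm Fe per 1,000 miles; the oil change leaves 10% of the
# old 10 ppm oil behind, so the fresh fill starts at 1.0 ppm.
shed_rate = 1.0                  # ppm Fe added per 1,000 miles
carryover_ppm = 0.10 * 10.0      # Fe inherited from the old fill

rates = []                       # apparent wear rate at each sample
for k in range(1, 11):           # sample every 1,000 miles
    total_ppm = carryover_ppm + shed_rate * k
    rates.append(total_ppm / k)  # ppm per 1K miles, as a UOA reports it
    print(f"{k}K = {total_ppm / k:.2f} ppm/1K")
```

The printed apparent rate starts at 2.00 ppm/1K and decays toward the true 1.0 ppm/1K, purely because the fixed carry-over gets averaged over more and more miles.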
Better?
Ed