I took 2 samples for my most recent UOA, sending one to Blackstone and the other to a guy I know who works in another lab. Results from the 2 labs were close except for one parameter: fuel dilution.
NOTE: The numbers below are in percent, but the message board won't let me use the percent character.
Blackstone measured under 0.5; Lab X measured 2.1.
I called Blackstone, and they said they estimate fuel dilution from the flash point. Lab X uses ASTM D3525M (whatever that is). Lab X says flash point is an inaccurate way to estimate FD: an oil can boil off its low-boiling-point components, which raises the flash point and gives false negatives on FD, or it can decompose into lower-flash-point components, which gives false positives on FD.
Lab X says 2.1 is "normal" for D3525M measurements on oil from motorcycle engines like mine, and Blackstone says under 0.5 is normal for their method, so either way there's supposedly no problem. Still, 0.5 versus 2.1 is a big discrepancy.
Any insight from the board here? Thanks,