First, I acknowledge that I'm beyond my skill level on this subject, hence the question.
There seem to be two UOA methods for determining fuel dilution of engine oil: 1) inferring a dilution % from an oil sample's flashpoint (e.g. Blackstone), and 2) gas chromatography (e.g. Polaris). Gas chromatography seems to be regarded as the more precise method and typically shows a higher level of fuel contamination than the flashpoint method.
As I understand gas chromatography, it analyzes and categorizes sample vapors by molecule length, or basically by the number of carbon atoms. Gasoline molecules have fewer carbon atoms than engine oil molecules and are segregated on this basis. But the carbon count of gasoline isn't radically different from that of engine oil, and I'm wondering if the GC method is as precise as advertised. For example, doesn't normal mechanical shearing essentially break carbon atoms off of engine oil molecules, creating the opportunity for these sheared fragments to appear in the same general category as some gasoline molecules?
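To make my concern concrete, here's a naive sketch of the carbon-number bucketing I'm imagining. The ranges are typical textbook values (gasoline roughly C4-C12, engine oil base stocks roughly C18-C40), not any lab's actual cutoffs, and real GC fuel-dilution methods are surely more sophisticated than this:

```python
# Illustrative sketch only, not a real GC algorithm. GC effectively
# sorts hydrocarbons by boiling point, which tracks carbon number.
# Ranges below are rough textbook values, not lab-specific cutoffs.
GASOLINE_RANGE = range(4, 13)   # gasoline: roughly C4-C12
OIL_RANGE = range(18, 41)       # engine oil base stocks: roughly C18-C40

def classify(carbon_count: int) -> str:
    """Bucket a hydrocarbon by carbon number. Under this naive cutoff,
    a hypothetical sheared oil fragment of ~12 carbons would be
    counted as fuel, which is the overlap I'm asking about."""
    if carbon_count in GASOLINE_RANGE:
        return "fuel"
    if carbon_count in OIL_RANGE:
        return "oil"
    return "other"

print(classify(8))    # octane, a typical gasoline molecule -> fuel
print(classify(30))   # a typical base-oil molecule -> oil
print(classify(12))   # a hypothetical sheared fragment -> fuel
```

If sheared fragments really did land in the gasoline bucket like this, GC would over-report fuel dilution, which is the heart of my question.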
This is relevant to me because, as I've posted here before, my 2015 Honda CRV has serial UOAs from Polaris showing fuel dilution of >5%. One UOA from Blackstone showed a flashpoint of 360 F and an inferred fuel dilution of about 1%. More than 5% dilution seems extreme; looking through other Blackstone UOAs, that would translate into a flashpoint of around 260 F, which seems absurdly low. As Honda can't find anything wrong with my CRV, I'm left to wonder if (hope?) the measurement method is to blame. And of course, how much precision can we expect from a $20-$30 analysis?
As noted, I'm far beyond my level of competence here so be gentle.