A long, long time ago, cars survived with carburetors that dumped excess gasoline into the intake system; fuel metering was less than precise, so they often ran too rich just to keep the engine running smoothly. It's true. The gasoline that ended up in the oil would evaporate as the engine heated up the next time it was driven. Oil change intervals were 3K miles (2K before that), and that was enough to take care of the dilution.
The problem with trying to figure out whether fuel dilution today is really a problem is that you start with lab testing that may be trend accurate but not sample accurate. That is, the error on one test is about the same as the error on every test, so the trend is more reliable than any single result. More than one test would be needed.
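The trend-accurate-but-not-sample-accurate idea can be sketched with a few made-up numbers: if the lab's systematic error shifts every reading by the same amount, each individual result is off, but the sample-to-sample changes are untouched. All values here are hypothetical, chosen only to illustrate the point.

```python
# Hypothetical numbers: true fuel dilution (%) across four consecutive UOAs.
true_dilution = [1.0, 1.5, 2.5, 4.0]
bias = 0.8  # assumed constant systematic lab error (%), same on every test

# What the lab actually reports: every reading shifted by the same bias.
readings = [d + bias for d in true_dilution]

# The trend is the change between consecutive samples.
true_trend = [b - a for a, b in zip(true_dilution, true_dilution[1:])]
read_trend = [b - a for a, b in zip(readings, readings[1:])]

# Each single reading is wrong by 0.8%, but the bias cancels out of the
# differences, so the reported trend matches the true trend.
```

This is why a single flagged result means less than a series of results moving in the same direction.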
Next is the idea that a lab test is a snapshot: it does not reflect an average, and it does not capture the highest percentage. There might also be periods of driving where the dilution gets really bad but clears up as the workload increases. In other words, you might have a problem that lab work would never detect; you can only speculate that it exists.
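A toy model (entirely made-up numbers, just to illustrate the snapshot problem): fuel accumulates in the sump during cold short trips and evaporates back out during sustained hot running, so the dilution at drain time can look fine even if the engine spent part of the interval well above it.

```python
# Toy model: short trips add fuel to the sump, highway running drives it off.
# The percentages are hypothetical and only illustrate the shape of the problem.
dilution = 0.0
history = []
trips = ["short", "short", "short", "highway", "short", "highway"]

for trip in trips:
    if trip == "short":
        dilution += 1.2  # cold, rich running adds fuel to the oil (%)
    else:
        dilution = max(0.0, dilution - 2.0)  # heat evaporates fuel back out
    history.append(dilution)

peak = max(history)    # the worst dilution the engine actually saw
sample = history[-1]   # what the lab sees if you sample after a highway run
# Here the peak is several times the sampled value, so the UOA looks clean.
```

The same oil sampled right after the third short trip would tell a very different story, which is why sampling conditions belong in any testing plan.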
What we need is some help from BITOG members to work out a realistic plan for determining the extent of fuel dilution, one that takes into account driving conditions, driver habits, engine configuration, and lab results. The next step would be figuring out how to reduce the problem, knowing that eliminating it completely is not feasible. The solution might include modifying driving habits, oil selection, oil change intervals, an occasional Italian tuneup, gasoline additives, and more lab work.
Only OCD owners would even think about something like this, but that is what makes BITOG the website it is today.