You have to be careful with context when talking about the correlation between HTHS viscosity and engine wear. The comparison has to be made under the same conditions, for instance the oil being at the same temperature and under the same shear rate in a bearing or between two moving parts. Under those same operating conditions, an oil with lower HTHS viscosity will result in a smaller film thickness between the moving parts, and therefore opens up the possibility of more wear. Wear control comes down to two things: separating moving parts with an adequate film thickness to minimize rubbing and wear, and the film strength of the oil, which is what mitigates wear when the film thickness goes to zero and two moving surfaces start rubbing on each other.
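To put a rough number on the "thinner oil, thinner film" point above, here's a minimal sketch. It assumes a Hamrock–Dowson-style elastohydrodynamic relation where film thickness scales roughly with viscosity to the ~0.67 power; the exponent and the example HTHS numbers are illustrative assumptions, not measurements of any specific oil.

```python
# Rough illustration: in an EHL contact, central film thickness scales
# roughly with (dynamic viscosity)^~0.67 (Hamrock-Dowson type relation).
# Exponent and example viscosities are assumptions for illustration only.

def film_thickness_ratio(hths_a_cP: float, hths_b_cP: float,
                         exponent: float = 0.67) -> float:
    """Ratio of oil film thickness for oil A vs oil B at the SAME
    temperature and shear rate, all else being equal."""
    return (hths_a_cP / hths_b_cP) ** exponent

# Example: a 2.3 cP HTHS oil vs a 3.5 cP HTHS oil, identical conditions.
ratio = film_thickness_ratio(2.3, 3.5)
print(f"thinner oil gives about {ratio:.0%} of the thicker oil's film")
```

The point isn't the exact exponent, just that the film thins monotonically as HTHS drops when everything else is held constant.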
Of course, if you take some 0W-8 and run it through an effective oil cooler to keep the oil temperature down, it's going to have a higher operating viscosity and therefore more film thickness between parts. That's just making the operating hot viscosity higher by keeping the oil cooler. But if you ran that 0W-8 at a 250F operating temperature, it's not going to provide the same film thickness between moving parts as, say, a 0W-40 or any other thicker oil would at that same 250F operating temperature. That's the context that needs to be used when comparing engine protection against the HTHS viscosity factor.
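The "cooler oil = thicker oil" effect can be sketched with the Walther relation from ASTM D341, which is the standard way to interpolate kinematic viscosity between two reference temperatures. The KV40/KV100 inputs below are assumed, typical-looking numbers for a 0W-8, not data for any particular product.

```python
import math

# Walther / ASTM D341 viscosity-temperature relation:
#   log10(log10(KV + 0.7)) = A - B * log10(T_kelvin)
# Fit A and B through two (temperature, viscosity) points, then
# interpolate. Reference viscosities below are assumed values.

def walther_fit(t1_c, kv1, t2_c, kv2):
    """Return (A, B) fitted through two (deg C, cSt) points."""
    def z(kv): return math.log10(math.log10(kv + 0.7))
    def lt(t_c): return math.log10(t_c + 273.15)
    B = (z(kv1) - z(kv2)) / (lt(t2_c) - lt(t1_c))
    A = z(kv1) + B * lt(t1_c)
    return A, B

def kv_at(t_c, A, B):
    """Kinematic viscosity (cSt) at temperature t_c in deg C."""
    zz = A - B * math.log10(t_c + 273.15)
    return 10 ** (10 ** zz) - 0.7

def f_to_c(t_f): return (t_f - 32.0) * 5.0 / 9.0

A, B = walther_fit(40.0, 22.0, 100.0, 5.0)  # assumed 0W-8-like points
print(f"at 250F: {kv_at(f_to_c(250.0), A, B):.1f} cSt")
print(f"at 195F: {kv_at(f_to_c(195.0), A, B):.1f} cSt")
```

With these assumed inputs, the same oil runs meaningfully thicker at 195F than at 250F, which is exactly the oil-cooler effect described above; it still doesn't make a hot 0W-8 behave like a hot 0W-40.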
Studies do show that more wear can result in most engine components as the HTHS viscosity gets lower ... typically once it gets to around 2.4 cP and below. This is why engines that specify 0W-16 and lower oil grades use special engine design features to help control wear as the film thickness shrinks along with the viscosity.
View attachment 288697