Basically, the lower the piston speed, the higher the load required to accomplish a given amount of work. Also, engines have a sweet spot determined by the cam grind and well-matched engine components, final drive, etc.
More cycles create more wear in a given time if the load is identical, but thanks to MOFT (minimum oil film thickness) there isn't really a legitimate linear equation I've been able to find.
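The speed/load tradeoff in the first sentence is just power = torque × angular speed: for a fixed rate of work, lower engine speed means proportionally higher torque (load). A minimal sketch, with illustrative numbers (the 50 kW figure and RPM values are assumptions, not from the thread):

```python
# Sketch of the power/torque/speed tradeoff: for a fixed power output,
# lower engine speed requires proportionally higher torque (load).
import math

def torque_for_power(power_w: float, rpm: float) -> float:
    """Torque in N·m required to deliver power_w watts at the given RPM."""
    omega = rpm * 2 * math.pi / 60  # convert RPM to rad/s
    return power_w / omega

# Same 50 kW of power at two different engine speeds (illustrative numbers):
print(torque_for_power(50_000, 2000))  # ~238.7 N·m
print(torque_for_power(50_000, 4000))  # ~119.4 N·m -- half the load at double the speed
```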
I would guess the least engine wear occurs at the minimum RPM that still maintains optimum hydrodynamic lubrication. So it depends on many different factors and variables ...
Just a guess, I ain't no expert!
Yes, and stress on components subject to significant inertial loading goes up with the square of speed... Remember that only the journals run with MOFT; everything else sees a mix of mixed and boundary lubrication, so we have to further qualify wear by which part or region we're referring to, or the machine as a whole. ...
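The square-of-speed claim above can be sketched with the first-order formula for peak reciprocating inertial force, F = m·r·ω². This is a simplified approximation that ignores the rod-ratio correction term, and the mass and stroke figures are made-up illustrative values:

```python
# First-order peak inertial force on a reciprocating mass: F = m * r * omega^2.
# Ignores the connecting-rod-ratio correction; numbers are illustrative only.
import math

def peak_inertial_force(mass_kg: float, stroke_m: float, rpm: float) -> float:
    """Approximate peak inertial force on a reciprocating mass, in newtons."""
    omega = rpm * 2 * math.pi / 60      # crank angular speed, rad/s
    radius = stroke_m / 2               # crank radius is half the stroke
    return mass_kg * radius * omega**2

f_3000 = peak_inertial_force(0.5, 0.090, 3000)  # 0.5 kg piston+rod, 90 mm stroke
f_6000 = peak_inertial_force(0.5, 0.090, 6000)
print(f_6000 / f_3000)  # → 4.0: doubling RPM quadruples the inertial load
```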
Very interesting, I have never heard that before. I wonder how true that is.

I've heard that an engine will run "forever" at 80% load at 80% of redline RPM.
Too little load, like revving in neutral, is said to be bad because the con-rods stretch out. They wouldn't if they had a beefy air/fuel mixture to compress.
I bet it would. It would also depend a lot on what the weak link on said engine is. For example, if an engine is known to wear out its timing chain early, a lower RPM would most likely be better, since the chain is directly affected by RPM and not directly affected by load. However, if the engine in question were known for wearing out rod bearings early, I suppose slightly shorter gearing would be better (within reason, obviously) to keep the load on the bearings down.

Both motors are producing the same horsepower; one sees higher RPM, the other sees higher torque. I don't know the answer, but I would think it would depend a lot on the design of the motor.
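The "weak link" point above can be sketched numerically: two hypothetical engines delivering the same power, one geared for higher RPM, one for higher torque. Cycle-driven wear (e.g. a timing chain) tracks revolutions, while load-driven wear (e.g. rod bearings) tracks torque. The power and RPM figures are assumed for illustration:

```python
# Crude wear proxies for two engines making the same power at different RPM:
# torque (load-driven wear, e.g. bearings) vs. revolutions (cycle-driven
# wear, e.g. timing chain). All numbers are illustrative assumptions.
import math

def wear_proxies(power_w: float, rpm: float) -> tuple[float, float]:
    """Return (torque in N·m, revolutions per hour) at the given power and RPM."""
    omega = rpm * 2 * math.pi / 60  # crank angular speed, rad/s
    return power_w / omega, rpm * 60

for name, rpm in (("high-rpm engine", 4000), ("high-torque engine", 2500)):
    torque, revs = wear_proxies(50_000, rpm)
    print(f"{name}: {torque:.0f} N·m load, {revs:,.0f} revs/hour")
```

The high-RPM engine racks up more cycles per hour; the high-torque engine carries more load per cycle. Which trades off better depends, as the post says, on which part fails first.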