While display resolution has a real and direct impact on perceived picture quality, there are more important factors at play: factors that 99% of consumers never consider when comparing TVs.
As many know, resolution is simply the number of pixels a display contains in total. Manufacturers love this number because it's BIG. Big numbers sell. What those manufacturers don't want you to realize is that resolution plays a relatively small part in overall picture quality. Motion handling, color accuracy, contrast ratio, local dimming zones and algorithm, panel type (IPS vs. VA)... those are the true deciding factors to look for when shopping for a new TV.
Even in 2022, one could EASILY put a properly calibrated 1080p Pioneer Kuro plasma next to a current, calibrated 4K/HDR panel, feed each its native resolution, and be unable to tell the difference at a typical viewing distance.
Streaming services have, in my opinion, all but negated the need for constant improvement in display tech. We're currently blessed with 4K resolution, HDR/Dolby Vision, WCG, local dimming (OLED excluded), a myriad of motion settings, etc. Two of those apply directly to streamed content; all of them apply to the display showing it.
Streaming services face a single major bottleneck: bandwidth. Sure, Netflix can claim that it streams in "UHD with HDR", but not all video is created equal. All it takes is a peek at the metadata. A Netflix 4K HDR stream tops out at less than 20 Mbps, while a physical UHD disc can easily hit 40-50 Mbps. Therein lies the truth: you can compress anything down to adapt it to your needs, but there WILL be sacrifices.
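To put those bitrates in perspective, here's a quick back-of-the-envelope sketch in plain Python. The two-hour runtime and the ~16 Mbps / ~50 Mbps figures are illustrative assumptions consistent with the caps above, not measured values:

```python
# Rough data-per-movie comparison at the bitrates cited above.
# Exact figures vary by title and service; these are assumptions
# for illustration, not measured values.

RUNTIME_HOURS = 2.0  # hypothetical two-hour film

def total_gigabytes(bitrate_mbps: float, hours: float = RUNTIME_HOURS) -> float:
    """Convert an average video bitrate (Mbps) into total data (GB)."""
    seconds = hours * 3600
    megabits = bitrate_mbps * seconds
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

for label, mbps in [("Netflix 4K HDR stream (~16 Mbps)", 16),
                    ("UHD Blu-ray disc (~50 Mbps)", 50)]:
    print(f"{label}: {total_gigabytes(mbps):.1f} GB over {RUNTIME_HOURS:g} hours")
```

Roughly 14 GB versus 45 GB for the same film; that threefold gap in data is exactly where the compression sacrifices hide.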
As long as streaming is the de facto standard in the majority of homes, I believe further improvement in consumer displays has reached a dead end.