50 years? Nope.
I spent 20 years in Semi Mfg Equipment, which supplies the machines that process wafers into chips.
For you scientists, the current tech node is 5nm. That means the smallest traces (wires) on the densest layer of the chip are 5 billionths of a meter.
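For a sense of scale, here's a quick back-of-the-envelope check (the ~80 micrometer hair width is just a typical figure I'm assuming for illustration, not a spec):

```python
# How small is 5 nm? Compare against a human hair
# (~80 micrometers wide -- an assumed typical value).
TRACE_WIDTH_M = 5e-9    # 5 nanometers = 5 billionths of a meter
HAIR_WIDTH_M  = 80e-6   # ~80 micrometers

print(f"{HAIR_WIDTH_M / TRACE_WIDTH_M:,.0f} traces fit across one hair")
# -> 16,000 traces fit across one hair
```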
You may wanna Google 5nm 3D NAND memory...
Anyways, turning this science into a usable technology takes years.
The technology evolves constantly; R&D is costly.
Companies (Intel, Samsung, etc.) wanna get this stuff out to consumers.
Now, the government does have super scientists (and programmers!) using available technology.
At any cost...
Beyond that, companies you know develop products and sell them to the government under confidentiality agreements.
They might be years ahead of other countries in how they use technology, but you're right that there's certainly no way their electronics manufacturing is somehow ahead of Intel, Samsung, or even TSMC. They don't really do anything other than buy commercial products and customize them.

However, these days FPGA technology has made it possible to build high-performance custom parts in small quantities for specific purposes. The same work can be done in software, but not as efficiently as in hardware. I remember hard-coding an algorithm into hardware once. It was a PITA to fix whenever I made a mistake, but it ran as fast as a PC clocked 50+ times faster (rough numbers sketched below).
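To put a number on that speedup, here's a throughput sketch. The clock speeds and cycle counts are illustrative assumptions, not measurements from that project; the point is that a fully pipelined design delivers one result per clock, so a slow FPGA clock can still match a much faster CPU:

```python
# Back-of-the-envelope throughput model (assumed numbers, not measured):
# a fully pipelined FPGA design produces one result per clock cycle,
# while a CPU spends many cycles per result running the same algorithm
# in software.

CPU_CLOCK_HZ  = 3.0e9   # hypothetical 3 GHz desktop CPU
FPGA_CLOCK_HZ = 200e6   # hypothetical 200 MHz FPGA fabric clock

CYCLES_PER_RESULT_CPU  = 750  # assumed software cost of the algorithm
CYCLES_PER_RESULT_FPGA = 1    # fully pipelined: one result per clock

cpu_throughput  = CPU_CLOCK_HZ / CYCLES_PER_RESULT_CPU    # results/sec
fpga_throughput = FPGA_CLOCK_HZ / CYCLES_PER_RESULT_FPGA  # results/sec

print(f"CPU:  {cpu_throughput:,.0f} results/sec")
print(f"FPGA: {fpga_throughput:,.0f} results/sec")
print(f"FPGA advantage: {fpga_throughput / cpu_throughput:.0f}x")
# -> FPGA advantage: 50x
```

Once the pipeline is full, the hardware hands you a result every single clock, so the CPU has to run that many times faster just to keep up.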
I do remember interviewing for positions at defense/space contractors. It was odd to me how they were literally at least two generations behind in semiconductor node tech. They apparently valued reliability and proven performance over state-of-the-art speed.