"Standards" aren't always better. Often, it's flawed standards that give rise to proprietary.
Funny way of thinking...
Standards are better when things need to interoperate.
At the time the PC layout was ordained, there was a lot of room for electro-physical advances. Well, progress marches on, and all that headroom for improvement has been used up. The $5,000 PC is a performance giant, with computational capability exceeding what cost $50,000,000 to deploy to a national lab in 1999.
Terascale computing went to exascale in about 25 years, a scale factor somewhere between 1,000 and 1,000,000 depending on what you measure. It's difficult to compare precisely because the architectures are so different.
The time has come when sheer physics improvements have slowed down, and advancing the architecture is the appropriate next step.
Most notably, what used to be a collection of upgradable PC modules, where users had a cost interest in minor upgrades to climb the ladder of change, can now be managed as one replaceable unit, e.g., the NUC or a game console.
These more tightly integrated form factors can be more reliable, and a whole unit is now as cheap as individual peripheral upgrades were in the previous IBM AT era.
When you want to make an architecture change, it disrupts the interdependence of system layers. Vertically integrated products can, by their nature, be managed to promote the advantages of such change. Horizontally integrated products must deal with the disruption.
The PC industry will muddle its way through: it's a much more diverse and richer industry than Apple, but it has lost any masthead brand to help us understand what a PC means as a commodity, while Apple is a mega-brand. This creates the appearance that PCs are lagging. They're not; it's just becoming unclear to buyers what, if anything, offers new value.
I think much bigger changes than Apple is making right now are required and possible. But they aren't likely to be made, for the same sorts of reasons that no progress is being made on the ecological crisis: it's not in the industry's general interest to change.
I don't see what "standardization" means as a limit to Intel and Microsift: they can do anything they want, they just want to keep making money the same way as before.
Something that mystifies me about Musk taking over Twitter is how he summarily disposed of a valuable cadre of elite web staff as if they were garbage, in order to service the debt on a deal he didn't even want. What a dick. An innovator would have had an idea for how to repurpose that incredible wealth of talent into a new inflection point for the web. But like most dim-witted captains of industry, he just looked at the balance sheet and flipped the switch on the shredder. Dumb beyond all reckoning. Unless maybe he absorbs them into another venture?