When TB was introduced, its performance potential seemed pretty great, like, omg: the (fantasy) PCI bus moves outside the box, with bold new config possibilities for heterogeneous device clusters.
Apple seemed to me to implicitly appeal to that fantasy by consolidating their small-system I/O into a common port, making their designs seem svelte on one hand and expandable on the other. Simple and flexible. Seeing the display connection unified with the USB connection egged on my fantasy (unconsciously, at the time), because the most important and demanding consumer PCIe device is the display adapter, and watching the hybrid emerge seemed cool.
But the payoff ultimately seems blasé. Look at the chart above and compare TB's evolution with PCIe's. Duh. By simple physics, a long-wire external bus is never gonna keep up with its internal, short-wire, parallel parent design.
But the chart also shows something that maybe should have been predicted, yet was still an unexpected development: PCIe gen 3 stagnated on Intel for 11 years, whereas the projected pace of evolution was more like 3.5 years per generation.
The mess which is TB4 today is still about PCIe gen 3 x4 for most users, thanks to this stagnation.
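To put some numbers on the chart's comparison, here is a back-of-envelope sketch using nominal spec figures: per-lane PCIe data rates by generation (after 8b/10b or 128b/130b encoding overhead) against Thunderbolt link bandwidth, with TB3/TB4's PCIe tunnel treated as roughly 32 Gb/s. The figures are ballpark, not measured.

```python
# Per-lane usable PCIe data rate in Gb/s, by generation (nominal spec,
# after line-encoding overhead: 8b/10b for gen 1-2, 128b/130b for gen 3+).
pcie_lane_gbps = {1: 2.0, 2: 4.0, 3: 7.88, 4: 15.75, 5: 31.5}

# Thunderbolt total link bandwidth in Gb/s. TB3/TB4 advertise a 40 Gb/s
# link, but the PCIe tunnel inside it is capped around 32 Gb/s.
tb_link_gbps = {"TB1": 10, "TB2": 20, "TB3": 40, "TB4": 40}
tb_pcie_tunnel_gbps = 32  # effective PCIe payload over TB3/TB4

def lane_equivalent(tb_gbps: float, gen: int) -> float:
    """How many lanes of a given PCIe generation a TB link is 'worth'."""
    return tb_gbps / pcie_lane_gbps[gen]

# TB4's tunnel vs. the generation Intel actually shipped for a decade:
print(f"TB4 ~= PCIe gen3 x{lane_equivalent(tb_pcie_tunnel_gbps, 3):.1f}")
# ...and vs. the generation we'd have had if PCIe had kept its cadence:
print(f"TB4 ~= PCIe gen5 x{lane_equivalent(tb_pcie_tunnel_gbps, 5):.1f}")
```

The arithmetic matches the argument: against the gen 3 that stagnated, TB4's tunnel is about x4; against a gen 5 that arrived on schedule, it would be worth about x1.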
OTOH, PCIe gen 3 x4 I/O is nothing to sneeze at: a good middle ground for consumer fare.
Yet the projection was that TB would be worth only a PCIe x1 by this date? Why did it stagnate? The designers must know all this.
PC gaming graphics was not a harbinger of UI evolution. Putting transparency and zoom effects in a UI is not challenging compared to 3D first-person open-world gaming. The killer app for consumer graphics turned out to be video decode for ever-increasing display resolutions. This was developed in concert with the big entertainment companies, and it doesn't feel like a killer app to consumers 'cause generations see TV as a god-given right, and the PC replaced the TV (literally: every TV now has a PC-thing inside of it). This is very important evolution, but also boring, because it's a pure engineering matter which can be solved and budgeted, as Intel's iGPUs show.
So for about five years, Thunderbolt sort of looked to me like it was breaking I/O out of the box, and this seemed vaguely promising.
Yet architecturally it's led to almost nothing of interest. USB keeps up with it in practice. NVMe storage can use its throughput, but somehow this doesn't translate into any capability that feels like a breakthrough for peripheral storage; it enables nothing that wouldn't otherwise be possible, except shuffling hi-res web video from minimally provisioned base-system drives to an external drive. And this brings me back to TV again (what does web video mean for most of us? incessant contact with cam-whores of infinite stripes).
As an aside: NVMe, while solidly 5x faster than SATA, does not make any PC feel 5x faster; it just keeps the UI perky enough to feel like it's not standing in your way. Meanwhile, UIs have felt completely insane since Web 2.0, but like crappy TV shows, everyone enjoys the novelty.
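The "5x" claim checks out on paper. A minimal sketch with ballpark real-world sequential-throughput ceilings (these are typical figures, not benchmarks of any particular drive):

```python
# Rough sequential-throughput ceilings, real-world ballpark:
sata3_mbps = 550         # SATA III: 6 Gb/s link, ~550 MB/s after overhead
nvme_gen3x4_mbps = 3500  # NVMe over PCIe gen 3 x4: ~3.5 GB/s typical

speedup = nvme_gen3x4_mbps / sata3_mbps
print(f"NVMe is roughly {speedup:.1f}x SATA in sequential throughput")
# ...yet everyday desktop work is dominated by small random reads and
# software latency, so the machine never *feels* ~6x faster.
```

The gap between that raw multiplier and the felt experience is exactly the point: throughput stopped being the bottleneck for the UI a while ago.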
Gaming is less interesting for tech than its fan base would make you think (or, as I'm inclined to think, it's just more cam-whores). It has been perfected to the point that streaming is eclipsing the dedicated gaming rig, except for the extreme-sports set. GPUs are in short supply, but not because of a UI or content-creation or gaming revolution; they feed the vast Ponzi scheme of crypto-asset production.
Moreover, we are in this paradoxical passage of history where PC HW has somehow advanced even beyond the capacity of SW bloat, yet we still feel we don't have enough. Is there any incredible new tech less compelling than 8K displays?!
Back to TB: the long wires aren't that long, and they cost a fortune. Today it costs an additional $300-400 US to connect a pair of displays and a backup drive to a $1200 MacBook via a CalDigit TB4 hub, and the MacBook doesn't support more than one external display. Cables are insanely expensive. You can buy a fully provisioned NUC with Windows for the same money as the TB expansion.
At the same time, a dGPU can be connected to a mini via TB, but the iGPU is engineered to run most apps.
High-end video producers will buy dedicated workstations à la the Mac Pro. Gamers are on PC.
As a modern hackintosher, I am keenly aware of the failed promise of mature TB, and of the tedium of even thinking about it, because it obviously does almost nothing that isn't covered by pre-existing tech. (Special props to the audio nerds, a special herd of cats who pray at an altar of sensory rapture, contrary to everything the video-production gurus have to teach about system configuration.) We already have USB and display ports. If you have a desktop, you have PCIe for nerding out. And if you have a non-Apple portable... wait, why do you have a non-Apple portable but want to run macOS? By nature you are a haploid whose PC genetic inheritance has incurred upon you a perverse need for pain.
So TB amounts to a very expensive display cable. And the promise of TB is delivered as a fashionably minimal connector arrangement on Apple devices sold to a market of sexy vloggers, the end result being the cultural triumph of the anonymous YouTube front page and QAnon.
Once upon a time, there was hope that a TB-like interconnect would commodify explicitly parallel, heterogeneous multicomputer architectures, enabling very flexible and powerful systems to be scaled out of inexpensive modules. Curiously, it was discovered that no one really cares about such a holy grail of computing. The future has always preferred just packing more logic into the chips.
It's now completely clear that TB is an Apple-specific value-add, and with TB4 it looks very close to being an Apple-branded feature, where there is reason to worry that TB peripherals will all become vendor-specific: is that an Apple TB or a Dell TB? It's not a super big deal; this has always been true. But when Apple was making its kit from PC parts, there was a crossover between worlds which created consumer possibilities. Now it's gonna be Apple and MSFT fighting over the entire stack again, top to bottom, like the days when that Pepsi idiot took over Apple.
Does either company have a vision for the future beyond stuffing their markets into what are doomed to be the caskets called their "stores"?
Linux?
I've never had less hope in the promise of personal computing.