There's this notion of a technological cycle of reincarnation, where an implementation is followed by a logical successor, which leads to another, and so on down a chain until someone has a bright idea that looks an awful lot like the original implementation. For example, CPUs used to do all their math by themselves, then they outsourced the hard parts to floating point coprocessors, then those got built into the main CPU chip again. Graphics subsystems are particularly prone to the cycle.
It just occurred to me that with the recent trend toward enabling advanced graphics for laptops and mini-systems by extending the PCIe bus out of the chassis via Thunderbolt or USB 3.1 or the like, it's time for someone to take the cycle in a new direction:
It's time for monitors to integrate graphics processors.
Right now you might connect your monitor with VGA, DVI (variants -I and -D), HDMI (in several generations of backwards compatibility) or DisplayPort (several generations). And monitors are, relatively speaking, much cheaper than they used to be. In the 1990s we thought that $600 was not too high a price to pay for a high-end 16" CRT. Now $400 will buy you a 40" 4K LCD.
Why not just use Thunderbolt 3 or USB 3.1 or something similar and build a video card into the back of the monitor, perfectly matched to that monitor's capabilities? You get to push the power supply and cooling requirements outside the computer.