Setting the screen resolution is more automatic than it used to be
In years gone by, choosing a screen resolution was almost entirely a trial-and-error process left to the user, since Windows defaulted to one or two low-resolution display modes. It was a 'lowest common denominator' approach that guaranteed that people installing Windows would at least be able to see what was on the screen from the get-go. Higher-resolution displays were the job of the graphics card manufacturers, and each card came with a disk full of drivers. Even after that, not all resolutions were appropriate for all monitors, and you could (and often did) get garbage or a black screen if you tried to use them. Fortunately, Windows has always had a built-in safeguard.