Historically, new technologies are often developed as proprietary, and companies try, if possible, to secure a monopoly (this is why patents grant a temporary monopoly on an invention, typically around twenty years, before it passes into the public domain — the idea is to protect a creator for a while so they can do business without fear of having their idea copied by competitors). Typical examples include common things like electrical power in the latter part of the 19th century, or petrol for your car. In those days, you had to use devices with special plugs and sockets in your home to use electricity from a particular provider; and cars would only run on the special petrol of a specific brand.
Quickly, however, manufacturers of electrical devices saw that they would expand their market if all power companies used the same standards; and consumers, obviously, would be happy to know that their appliances would still work even if they changed providers. Imagine a world where Ford’s petrol wouldn’t work in Nissan or BMW cars.
This led to the emergence of industry standards. In some areas (petrol being one of them), governments enforce the industry standards; in others, the market simply self-regulates. As noted, consumers, given the choice, will opt for standard solutions that work across a variety of providers rather than solutions that tie them to a single provider (who might overcharge thanks to the monopoly, or go bankrupt and leave their customers with appliances that no longer work).
The notion that companies cooperate with their competitors to develop common standards is the hallmark of industrialisation in the 20th century — as opposed to the 19th century, when each manufacturer (mostly) had their own standards. The idea that you can plug any phone into your landline and it will work made phone communications ubiquitous. In fact, most technologies that became ubiquitous are based on industry standards.
Strangely, though, the computer world was initially not very keen to adopt “standards”. For instance, it was not until 1957 that a programming language became available for more than one brand of computer — the market was so small that the industry saw no advantage in standardising computer languages across platforms. However, this language (IBM’s Fortran) enabled the early programmers to develop their software in a single programming language and deploy it on any number of computers. The idea quickly caught on: instead of retraining your programmers in a different language every time a new computer model was launched, they could learn one language well, and software written for one computer could be carried over to the next model — or even a different brand — without having to start from scratch.
The problem back then was simply who would guarantee that Fortran would remain independent of any vendor. The solution, as in the rest of industry, was to submit the language’s specification to an independent standardisation body (in Fortran’s case, ANSI).