Charlie Stross gave an interesting talk to the Chaos Computer Club about the first generation of AIs that is wrecking the planet: Dude, you broke the future!
History gives us the perspective to see what went wrong in the past, and to look for patterns, and check whether those patterns apply to the present and near future. And looking in particular at the history of the past 200-400 years—the age of increasingly rapid change—one glaringly obvious deviation from the norm of the preceding three thousand centuries is the development of Artificial Intelligence, which happened no earlier than 1553 and no later than 1844.
I'm talking about the very old, very slow AIs we call corporations, of course. What lessons from the history of the company can we draw that tell us about the likely behaviour of the type of artificial intelligence we are all interested in today? ...
What do our current, actually-existing AI overlords want?
Elon Musk—who I believe you have all heard of—has an obsessive fear of one particular hazard of artificial intelligence—which he conceives of as being a piece of software that functions like a brain-in-a-box—namely, the paperclip maximizer. A paperclip maximizer is a term of art for a goal-seeking AI that has a single priority, for example maximizing the number of paperclips in the universe. The paperclip maximizer is able to improve itself in pursuit of that goal but has no ability to vary its goal, so it will ultimately attempt to convert all the metallic elements in the solar system into paperclips, even if this is obviously detrimental to the wellbeing of the humans who designed it.
Unfortunately, Musk isn't paying enough attention. Consider his own companies. Tesla is a battery maximizer—an electric car is a battery with wheels and seats. SpaceX is an orbital payload maximizer, driving down the cost of space launches in order to encourage more sales for the service it provides. Solar City is a photovoltaic panel maximizer. And so on. All three of Musk's very own slow AIs are based on an architecture that is designed to maximize return on shareholder investment, even if by doing so they cook the planet the shareholders have to live on. ...
The problem with corporations is that despite their overt goals—whether they make electric vehicles or beer or sell life insurance policies—they are all subject to instrumental convergence insofar as they all have a common implicit paperclip-maximizer goal: to generate revenue. If they don't make money, they are eaten by a bigger predator or they go bust. Making money is an instrumental goal—it's as vital to them as breathing is for us mammals, and without pursuing it they will fail to achieve their final goal, whatever it may be. Corporations generally pursue their instrumental goals—notably maximizing revenue—as a side-effect of the pursuit of their overt goal. But sometimes they try instead to manipulate the regulatory environment they operate in, to ensure that money flows towards them regardless.
Human tool-making culture has become increasingly complicated over time. New technologies always come with an implicit political agenda that seeks to extend their use, governments react by legislating to control the technologies, and sometimes we end up with industries indulging in legal duels.
For example, consider the automobile. You can't have mass automobile transport without gas stations and fuel distribution pipelines. These in turn require access to whoever owns the land the oil is extracted from—and before you know it, you end up with a permanent occupation force in Iraq and a client dictatorship in Saudi Arabia. Closer to home, automobiles imply jaywalking laws and drink-driving laws. They affect town planning regulations and encourage suburban sprawl, the construction of human infrastructure on the scale required by automobiles, not pedestrians. This in turn is bad for competing transport technologies like buses or trams (which work best in cities with a high population density).
To get these laws in place, providing an environment conducive to doing business, corporations spend money on political lobbyists—and, when they can get away with it, on bribes. Bribery need not be blatant, of course. For example, the reforms of the British railway network in the 1960s dismembered many branch services and coincided with a surge in road building and automobile sales. These reforms were orchestrated by Transport Minister Ernest Marples, who was purely a politician. However, Marples accumulated a considerable personal fortune during this time by owning shares in a motorway construction corporation. (So, no conflict of interest there!)
The automobile industry in isolation isn't a pure paperclip maximizer. But if you look at it in conjunction with the fossil fuel industries, the road-construction industry, the accident insurance industry, and so on, you begin to see the outline of a paperclip-maximizing ecosystem that invades far-flung lands and grinds up and kills around one and a quarter million people per year—that's the global death toll from automobile accidents according to the World Health Organization: it rivals the First World War on an ongoing basis—as side-effects of its drive to sell you a new car.