Saturday 17 January 2015

We're Not Ready For This: A.I.

I saw this the other day:

He covers deep learning and self-directed computer intelligence for the first fifteen minutes or so, then summarizes at about the 17:00 mark.  The social implications of deep machine learning are quite profound.

Here are some other artificial intelligence related media that you might want to peruse:

Ray Kurzweil's The Singularity is Near (a long and tedious mathematical read with some wonderful implications mixed in).

Her, Spike Jonze's ode to A.I.

A lot of Hollywood A.I. storytelling falls back on HAL-type horror, but this film doesn't; it goes all the way.  By the end you'll be questioning our shortcomings rather than fearing what a superior intelligence might do.  I wonder what Kurzweil thought of the A.I. in this film and what it ends up doing.


Better education doesn't help? Work is irrelevant? What do we do in a world of human pets that serve no real function in terms of survival? This could be an age of unprecedented creativity, or the beginning of the end.
The TED talk has an interesting moment in those final two minutes, where Howard talks about the social implications of an imminent (the next five years!) machine intelligence revolution.  He describes computers taking over jobs that we consider to be human-only and doing them better than people ever could.  This isn't about coding a better piece of software; it's about computers coding themselves in a never-ending cycle of improvement.  It's also about people no longer having to be responsible for their own survival decisions.


What happens to insurance companies when automotive accidents are a thing of the past?  Accidents don't happen when the A.I. managing traffic can not only control the car in question but also move the entire traffic jam up ten feet to avoid a collision.  A common misunderstanding is the fear that A.I.-driven vehicles could run bad code and cause a massive pile-up.  These aren't machines running fixed code; these are machines that create code as they need it, much as people do, but far faster and with absolute precision.  And however well they do it now, they'll do it better tomorrow.


What happens to human beings when they are no longer responsible for their own survival?
The busy truck driver still needs to sleep; what replaces him won't.  It'll never drive tired or hungry or angry or distracted either.  It'll only ever use the least amount of gas to get where it's going.  One of the tricky things about trying to grasp superhuman A.I. is envisaging all the ways in which it would be superior.  An A.I. like that would never stop improving; it would redefine what efficiency means in business.

As Howard says, machines that are able to build machines in a continuously improving manner are going to make the social change caused by the industrial revolution look like a blip on the radar.

Perhaps the hardest implication of a machine intelligence revolution is that our entire society is predicated on the idea that your income somehow reflects your usefulness.  If income and social status are no longer tied to human usefulness, what would society look like?

During the big market bailouts in 2008, someone online described businesses as cockroaches that feed off the work of human society.  He suggested that you don't feed them steak; you just let them thrive on the waste.  The implication was that capitalism is a necessary evil that serves human beings, not the other way around, as it's so often framed (people as a necessary evil serving capitalism).

The idea that people could be free to pursue their own excellence in the future without having to work for the cockroaches is quite thrilling, though it would require a huge jump in social maturity for human beings.  We'd have to begin identifying our own self-worth through our own actions rather than our education and employment.  I suspect most people aren't close to that.  We'd also have to recognize that everyone has a unique and valuable place in society, which sounds like socialism!

Education is as guilty as any social construct of aiming children at the idea that success means employability and income.  We stream students according to their intellectual capital and then tell them to work hard in order to achieve financial success in the future.  The very idea of effort is tied to financial success - something we'd have to change in a machine-intelligent future.  Can humans value themselves and seek excellence without the yoke of survival hung around their necks?

Universal income is an idea being floated in Switzerland and elsewhere.  If the future is one where people are no longer integral to their own survival, we'd better find something other than a survival instinct to base our self-worth on, or we're going to quickly run out of reasons for being.