Adapted from The Future of You (2014) by Lesley Scott

When Google acquired eight robotics companies in the second half of 2013, eyebrows were raised, particularly when it bought the maker of futuristic military-grade robots.[i] And when the search giant continued hiring expert after expert in the field of artificial intelligence (AI), tongues wagged. The recent purchase of the secretive, London-based DeepMind Technologies, which trains machines to learn the way we do, has sent the rumor mill into overdrive.[ii]

All signs point to Google endowing machines with serious mental superpowers. Not that the road to machine enlightenment has been bump-free so far. If you’ve ever been foiled by autocorrect, scratched your head at Siri or wondered how a routine Google image search can produce such random and awful results, then you’ve come up against the current limitations of what machines can be taught to learn. Machine learning, however, is starting to become a lot more learned thanks to new algorithms in which lower-level concepts ground the understanding of higher-level concepts, creating multiple layers of learning at different levels of abstraction.[iii] This deep learning is based on the way human nervous systems and brains recognize patterns and process information. During our very earliest stages of development, the area of the brain that processes sounds can also handle eyesight,[iv] an intriguing “general purpose” idea that is being modeled mathematically and used to design man-made, or artificial, neural networks.[v]

Neural networks are generally so computationally costly that they have been limited to about 10 million connections. However, Andrew Ng, Director of the Stanford Artificial Intelligence Lab, suspected that training larger networks would yield better results, so he used the power of his gig at Google to commandeer 16,000 CPU cores and increase the number of connections in the network to more than 1 billion.

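To make the layered idea concrete, here is a minimal sketch of a forward pass through a small, made-up neural network, written in Python with NumPy. The layer sizes, the tanh nonlinearity and the random, untrained weights are all invented for illustration; this shows the general shape of a deep network, not Google Brain’s actual design.

```python
# A minimal sketch of the "layers of abstraction" idea: a tiny neural
# network forward pass in plain NumPy. This is not Google Brain's actual
# architecture; the layer sizes and random, untrained weights are made up
# purely for illustration.
import numpy as np

rng = np.random.default_rng(0)

def layer(inputs, weights, biases):
    """One layer: a weighted combination of the layer below, passed through
    a nonlinearity so the next layer can build on these features."""
    return np.tanh(inputs @ weights + biases)

# Imagine a tiny 64-pixel image flowing upward through three layers,
# each one a more abstract summary of the one beneath it.
image = rng.random(64)                                  # raw pixels (lowest level)
w1, b1 = rng.standard_normal((64, 32)), np.zeros(32)    # would learn edge/blob-like features once trained
w2, b2 = rng.standard_normal((32, 16)), np.zeros(16)    # would learn shapes and textures
w3, b3 = rng.standard_normal((16, 1)), np.zeros(1)      # would learn a single high-level concept

h1 = layer(image, w1, b1)     # low-level features
h2 = layer(h1, w2, b2)        # mid-level features built from the low-level ones
score = layer(h2, w3, b3)     # high-level "is this a cat?"-style score

print(f"high-level score (untrained, so meaningless): {score[0]:.3f}")
```

Training would tune those random weights so that each layer genuinely learns useful features from the one below it; Google Brain spreads the same basic recipe across more than a billion connections.
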
The resulting network, Google Brain, is not unlike a newborn’s brain, just smaller in scale (for comparison, an adult human brain has about 100 trillion connections). And when the e-infant was exposed to a week of YouTube, what do you think it learned? “Indeed, to our amusement,” note Ng and his collaborator Jeff Dean, “one of our artificial neurons learned to respond strongly to pictures of… cats.”[vi] The network had not been told beforehand what a cat was, nor given a single image labeled “cat.” Perusing a gazillion YouTube stills was how it discovered by itself what a cat is. It’s an amazing example of self-taught deep learning… to say nothing of the mysterious power of kittens on YouTube. (For cats of the two-legged kind, there’s now even an algorithm that can determine whether your urban subculture is biker, surfer, punk, goth or hipster.)[vii]

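What makes the cat result striking is that no labels were involved: the network organized the raw stills on its own. One classic way to do that kind of label-free learning is an autoencoder, a network trained only to reconstruct its input, whose hidden layer ends up capturing recurring patterns in the data. The toy sketch below (in Python/NumPy, with invented data and sizes) shows the idea in miniature; the Google team’s system was a vastly larger and more sophisticated take on unsupervised learning.

```python
# A toy autoencoder: a network trained only to reconstruct its own input,
# with no labels anywhere. Everything here (the fake data, layer sizes,
# learning rate) is invented purely for demonstration.
import numpy as np

rng = np.random.default_rng(1)

# Fake "images": 200 samples of 16 pixels, secretly built from 3 underlying patterns.
patterns = rng.standard_normal((3, 16))
data = rng.random((200, 3)) @ patterns + 0.05 * rng.standard_normal((200, 16))

# One hidden layer of 4 units: the encoder compresses, the decoder reconstructs.
W_enc = 0.1 * rng.standard_normal((16, 4))
W_dec = 0.1 * rng.standard_normal((4, 16))
learning_rate = 0.01

for step in range(2000):
    hidden = np.tanh(data @ W_enc)     # compressed representation
    recon = hidden @ W_dec             # attempt to rebuild the input
    err = recon - data                 # reconstruction error (the only teaching signal)
    # Plain gradient descent on the mean squared reconstruction error.
    grad_dec = hidden.T @ err / len(data)
    grad_hidden = (err @ W_dec.T) * (1.0 - hidden ** 2)   # tanh derivative
    grad_enc = data.T @ grad_hidden / len(data)
    W_dec -= learning_rate * grad_dec
    W_enc -= learning_rate * grad_enc

final_recon = np.tanh(data @ W_enc) @ W_dec
print("mean squared reconstruction error after training:",
      round(float(np.mean((final_recon - data) ** 2)), 4))
```

Nothing in that loop ever sees a label; the only teaching signal is how well the network can rebuild what it was shown.
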
Machine learning also extends to natural language, which Google’s Director of Engineering, Ray Kurzweil, predicts will be the basis of search engines within five years.[viii] The current state of the art was showcased on prime-time television in the form of Watson, IBM’s incredibly powerful, interactive, artificially intelligent computer system capable of answering questions posed in natural language. It famously won the $1 million prize on Jeopardy!, soundly trouncing the human talent. Its knowledge bank was acquired by perusing Wikipedia and more than 200 million pages’ worth of natural-language documents. In the game show’s Rhyme Time category, when presented with the clue “A long tiresome speech delivered by a frothy pie topping,” Watson was able to respond correctly: “What is a meringue harangue?”

Deep learning is going to be the site of some serious next-level stuff that will dramatically shape how we interact with our world and the business environment in which we operate, especially given the amazing brainpower and technology being recruited:

  • Yahoo acquired LookFlow, a startup known for its deep learning image recognition tech, which will become part of Flickr.[ix]
  • Facebook recently hired deep learning expert Yann LeCun of New York University.[x]
  • The Google Brain team now includes Professor Geoffrey Hinton, a giant in the field of deep learning who revolutionized speech recognition and translation.[xi] Of the estimated 50 experts worldwide in deep learning, about a dozen work for DeepMind, Google’s latest acquisition. “I think this is the main reason that Google bought DeepMind,” opines Yoshua Bengio, an AI researcher at the University of Montreal. “It has one of the largest concentrations of deep learning experts.”[xii]

References

i. http://www.cbsnews.com/news/google-buys-8-robotics-companies-in-6-months-why/
ii. http://www.businessweek.com/articles/2014-01-27/the-race-to-buy-the-human-brains-behind-deep-learning-machines
iii. http://en.wikipedia.org/wiki/Deep_learning
iv. http://www.wired.com/wiredenterprise/2013/05/neuro-artificial-intelligence/
v. http://en.wikipedia.org/wiki/Artificial_neural_network
vi. http://googleblog.blogspot.com/2012/06/using-large-scale-brain-simulations-for.html
vii. http://www.wired.co.uk/news/archive/2013-12/16/urban-tribes-algorithm
viii. http://edition.cnn.com/2013/12/10/business/ray-kurzweil-future-of-human-life/
ix. http://techcrunch.com/2013/10/23/yahoo-acquires-startup-lookflow-to-work-on-flickr-and-deep-learning/
x. http://www.businessinsider.com/zuckerberg-hires-deep-learning-professor-2013-12
xi. http://www.wired.com/wiredenterprise/2013/03/google_hinton/
xii. http://www.technologyreview.com/news/524026/is-google-cornering-the-market-on-deep-learning/

Lesley Scott
Lesley Scott is editor of Fashiontribes.com, a top longtime indie blog about style, & author of The Future of You (2014). She is an eccentric 5.0 Fashion Futurist who uses the lens of the fashion tribes to grok the deep trends and hot technologies shaping pop culture today & tomorrow. She is also a Channel Curator for Sulia.com and a regular contributor to Answers.com. As host of the offbeat Fashiontribes Daily podcast, she delves into everything from the intersection of biology & fashion, to the Singularity & transhumanism, cyborgs, bionics, the aggressive rise of crafts and handmade stuff, futuristic bling...and just enough zombies to keep things interesting.