
Artificial Intelligence

Published on 08/01/2016 | Technology


AI Champions

AI gained impetus back in 1997 when the IBM computer Deep Blue defeated world chess champion Garry Kasparov in a portentous match-up. More recently, in 2011, a descendant of Deep Blue named Watson defeated the top two Jeopardy champions, again highlighting AI's capabilities.

The ancient Chinese board game Go is considered more complex than chess, with far more potential moves to consider. Google DeepMind, a London-based AI company acquired by Google in 2014, developed AlphaGo, AI software capable of learning and improving its performance at Go. Earlier this year, AlphaGo defeated 18-time world Go champion Lee Sedol. AlphaGo did more than simple number crunching; winning required advanced strategy and tactics. While winning a game of Go falls short of the Turing test, rapid advancements in AI capability are being made.

What Differentiates AI?

So what exactly is AI, and how does it differ from any other code? AI is far more than software comprising a number of if-then statements. The ability to learn and improve is a defining characteristic of AI. Some therefore classify techniques such as blacklisting, heuristics or word prediction – all of which can “learn” in a limited sense – as simple forms of AI. Blacklisting is used in solutions such as anti-phishing and anti-virus, heuristics are also widely used in the security industry, and the latest mobile keypads can now predict the next word based on the user’s past use.
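To make the word-prediction example concrete, here is a minimal sketch of the kind of “learning” such a keypad might perform: counting which word the user has typed after each previous word and suggesting the most frequent one. The structure and function names are illustrative assumptions, not any vendor’s actual implementation.

    # A naive next-word predictor that "learns" from the user's past input.
    # Illustrative sketch only; real keypads use far more sophisticated models.
    from collections import defaultdict, Counter

    history = defaultdict(Counter)   # previous word -> counts of following words

    def learn(text):
        """Update the model from a sentence the user has typed."""
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            history[prev][nxt] += 1

    def predict(prev_word):
        """Suggest the word most often seen after prev_word in past use."""
        following = history[prev_word.lower()]
        return following.most_common(1)[0][0] if following else None

    learn("see you at the office")
    learn("see you at the game")
    learn("see you at the office tomorrow")
    print(predict("the"))   # 'office' - predictions improve as more text is seen

Even this trivial counter improves with use, which is why some classify such techniques as simple forms of AI, while others reserve the term for systems that emulate human reasoning more directly.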

Genuine AI entails software that mimics human intelligence, emulating the processes of the human mind. Biological intelligence arises from the roughly 100 billion neurons in the human brain, which are interconnected by synapses. A neuron fires, and decisions are made, based on the input it receives from thousands of synapses.

Artificial neural networks (ANNs) emulate how the human brain functions. Interconnected nodes (emulating brain neurons) form a network and communicate with one another. Connections between nodes have numeric weights which vary as the network learns from experience; greater weights imply stronger connections between nodes. When the weighted sum of a node’s inputs exceeds a threshold, the node fires and produces an output. ANNs can be single-layer or multiple-layer. ANNs learn through a training function which alters the connection weights in an iterative process until the outputs are as expected. ANNs are useful for “fuzzy” problems such as weather prediction.
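The sketch below shows these ideas in miniature: a single-layer network (a perceptron) with threshold activation, trained iteratively by adjusting its connection weights until its outputs match the expected targets. The learning rate, epoch count and the toy task (the logical AND function) are assumptions chosen purely for illustration.

    # A minimal single-layer ANN (perceptron) with a threshold activation,
    # trained by iteratively adjusting connection weights. Toy task: logical AND.
    inputs  = [(0, 0), (0, 1), (1, 0), (1, 1)]
    targets = [0, 0, 0, 1]                      # expected outputs

    weights = [0.0, 0.0]                        # connection weights (one per input)
    bias = 0.0
    learning_rate = 0.1

    def fire(x):
        """The node fires only if the weighted sum of its inputs exceeds the threshold (0)."""
        weighted_sum = sum(w * xi for w, xi in zip(weights, x)) + bias
        return 1 if weighted_sum > 0 else 0

    # Training: repeat until every output matches its expected target.
    for epoch in range(20):
        errors = 0
        for x, target in zip(inputs, targets):
            error = target - fire(x)
            if error != 0:
                errors += 1
                # Strengthen or weaken each connection in proportion to its input.
                weights = [w + learning_rate * error * xi for w, xi in zip(weights, x)]
                bias += learning_rate * error
        if errors == 0:
            break

    print(weights, bias)                         # learned connection strengths
    print([fire(x) for x in inputs])             # [0, 0, 0, 1]

Multiple-layer networks extend the same principle, stacking layers of such nodes and propagating weight corrections backwards through them, which is what makes them suited to “fuzzy” problems like weather prediction.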

Keeping the Reins on AI

The potential of AI is so enormous that many have warned of the need to be wary of its consequences. Stephen Hawking cautioned that AI could one day lead to the end of mankind. Humans change fundamentally through learning and through evolution, a slow process. However, an AI machine could learn and adapt far quicker, leading to it becoming far smarter than us. Elon Musk, Bill Gates and others have also issued dire warnings of potential threats from AI getting too clever for our own good.

In the 1968 science fiction film classic 2001: A Space Odyssey, the computer HAL 9000 tries to kill the crew of the spacecraft Discovery One. What has until now been only in the realm of science fiction may yet become a more realistic threat. Today, various military weapons of destruction are hooked up to computing functions. Not only is the computing power behind weapons constantly expanding, but there are also pressures to make weapons more autonomous. While our country may view linking fully autonomous AI with weapons as myopic, who knows what another bellicose group may do. Perhaps keeping the reins on AI will be a vital security challenge at some point in the future.
