The Artificial Intelligence chat will take place on 11 May at 11:00. You will be able to ask your questions to Nuria Oliver, Scientific Director of the Multimedia, HCI and Data Mining & User Modeling Research Areas at Telefonica Research.
Users can sign up for the chat only if they are registered members of the inGenious Teacher Community. Once logged in to the Teacher Community, members can sign up for the chat in the events section.
What is AI?
In order to talk about Artificial Intelligence, we first need to think about human intelligence. One of its key elements is the ability to learn and generalize from experience. From the moment we are born, we interact with the world around us and learn from those interactions. For example, a small child might be fascinated by opening and closing drawers, repeating the action many times. On one occasion she might leave her fingers inside the drawer as she closes it and hurt herself. The next time, she will probably be more careful and make sure her fingers are clear of the drawer before closing it. This ability to process data from the external world, learn from experience and make appropriate decisions is central to human intelligence. Yet what seems trivial to us is extremely hard to replicate in a computer.
Artificial Intelligence or Machine Intelligence is the branch of science and engineering aimed at "making a machine behave in ways that would be called intelligent if a human were so behaving". This definition was put forth by John McCarthy in his 1955 Proposal for the Dartmouth Summer Research Project on Artificial Intelligence.
What do we need to build an intelligent machine? We need the ability to collect data from previous experiences, and algorithms that can analyze those data and identify patterns that help us understand and predict future experiences.
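The idea of collecting past experiences and using them to predict new ones can be sketched in a few lines of code. The example below is a toy nearest-neighbour classifier; the stored "experiences" (weights and heights of cats and dogs) are invented purely for illustration and are not from any real dataset.

```python
import math

# Past experiences: (weight in kg, height in cm) -> animal.
# These numbers are made up for illustration.
experiences = [
    ((4.0, 25.0), "cat"),
    ((3.5, 23.0), "cat"),
    ((20.0, 50.0), "dog"),
    ((25.0, 55.0), "dog"),
]

def predict(new_case):
    """Predict a label by finding the most similar past experience."""
    def distance(past):
        features, _label = past
        return math.dist(features, new_case)  # Euclidean distance
    _features, label = min(experiences, key=distance)
    return label

print(predict((4.2, 24.0)))   # closest to the cat examples -> "cat"
print(predict((22.0, 52.0)))  # closest to the dog examples -> "dog"
```

The program never contains an explicit rule such as "dogs are heavier than cats"; that pattern emerges from the collected data, which is exactly the point of learning from experience.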
AI research started after World War II, when a number of researchers independently began to work on intelligent machines. The English mathematician Alan Turing may have been the first: he gave a famous lecture on the subject in 1947, and he may also have been the first to decide that AI was best researched by programming computers rather than by building machines. By the late 1950s there were many AI researchers, most of them basing their work on programming computers, including pioneers such as Allen Newell and Herbert Simon, who founded the first artificial intelligence laboratory at Carnegie Mellon University, and McCarthy and Minsky, who founded the MIT AI Lab in 1959. They all attended the Dartmouth College summer AI conference in 1956, which was organized by McCarthy, Minsky, and Nathan Rochester of IBM.
I had the pleasure of meeting and working with Minsky when I did my PhD at the MIT Media Lab from 1995 to 2000.
Turing Test
Alan Turing's 1950 article Computing Machinery and Intelligence discussed the conditions under which a machine should be considered intelligent. He argued that if a machine could successfully pretend to be human, so that a knowledgeable observer could not distinguish it from a real person, then it should certainly be considered intelligent. To carry out the Turing Test, a person, called the interrogator (one who questions), interacts (by teletype, to avoid requiring that the machine look and sound like a human) with both a hidden human being and a computer. The interrogator does not know in advance which is which; the task is to find out, by asking questions, which of the two candidates is the computer and which is the human. If the interrogator is unable to decide within a certain time, the machine is considered intelligent. To date, no machine has passed the Turing Test, and it remains one of the biggest challenges of modern computing.
Note, however, that a machine could still be intelligent without knowing enough about humans to imitate one.
Data-driven AI
One of the first computer programs created to allow machines to learn from data was the neural network, a program that mimics the functioning of the brain. It can be taught to recognize patterns from examples: once trained, it can classify and identify patterns in large amounts of data, for instance performing optical character recognition (OCR). It can do all this at very high speed, sometimes faster than humans.
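To make this concrete, here is a minimal sketch of a perceptron, one of the earliest and simplest neural-network models. It learns the logical AND function purely from labeled examples; the function names, learning rate and epoch count are illustrative choices, not anything prescribed by the article.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn two weights and a bias from (inputs, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: the "neuron" fires (1) if the weighted sum is positive.
            output = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - output
            # Nudge the weights toward the correct answer: learning from experience.
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# Training examples: the truth table of logical AND.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(samples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(x1, x2) for (x1, x2), _ in samples])  # learned AND: [0, 0, 0, 1]
```

Nobody tells the program what AND means; it adjusts its weights from examples until its answers match the training data, which is the essence of how neural networks learn.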
Machine learning is a discipline within artificial intelligence that focuses on developing algorithms that enable computers to learn from data. Since the first neural network was invented, scientists have proposed and developed many new approaches to machine learning. Today there are numerous ways to achieve this goal, depending on the complexity of the problem, the type of available data, the goals of the system, and so on.
Even though progress towards the ultimate goal of human-like intelligence has been slow, many spinoffs have emerged in the process. All of you most likely interact with some AI system on a daily basis, often without being aware of it, from the spell checker and autocorrect system on your mobile phone to the Kinect sensor on the Xbox video console. Some examples include:
- Deep Blue, a chess-playing computer, beat Garry Kasparov in a famous match in 1997.
- Expert systems are used industrially: in some specialized areas of engineering and medicine, they augment or even replace professional judgment in specific use cases.
- Machine translation systems such as Google Translate are widely used, although their results are not yet comparable with those of human translators.
- Handwriting recognition is used in millions of personal digital assistants.
- Speech recognition is commercially available and widely deployed.
- Computer vision systems are used in many industrial applications, as well as in consumer products such as the Xbox Kinect.
AI in the Movies
There have been numerous movies depicting intelligent machines, including 2001: A Space Odyssey (the HAL 9000 computer), A.I. Artificial Intelligence, Minority Report, WALL-E and Star Wars.