A.I. for Movies
A.I. for Games
Artificial Intelligence has been a main focal point in game design for several years. Smarter AI is a feature that many gamers look for in new games, and there is a growing desire for enemies that can work together and learn as they fight. These desires have driven computer scientists to make developments in these areas.
Computer Scientists Role
Artificial intelligence relies on an intelligent decision system. Several decision systems have been applied in the past; the most prevalent are Fuzzy Logic, developed by Zadeh in 1973, and artificial neural networks (ANNs), while more recently evolutionary computation has become a powerful tool.
Fuzzy Logic is a mathematical approach to solving a problem, rather than a Boolean (True/False) approach. This way of solving problems is similar to the way humans reason.
For example, if you had a group of 50 individuals and you wanted to classify them as old or young, you can see how the two approaches differ. In the Boolean approach, anyone over a defined age, let's say 45, would be classified as old, and everyone else would be young. So someone aged 44.9 would be classified as young, but at 45.0 they would become old.
Fuzzy Logic instead works on a sloping line. It could say that everyone up to 40 is considered absolutely young, but from ages 40 to 50 they would increasingly be considered more old. This defines people as somewhat old or somewhat young during those years in which their age/oldness is fuzzy.
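The sloping line above can be sketched as a membership function. This is a minimal illustration, not from any particular library; the thresholds (40 and 50) come from the example, and the function names are made up for clarity:

```python
def boolean_old(age):
    # Crisp (Boolean) classification: a hard threshold at 45.
    return age >= 45.0

def fuzzy_old(age):
    # Fuzzy membership: a degree of "oldness" between 0.0 and 1.0.
    # Up to 40 -> 0.0 (absolutely young); 50 and over -> 1.0 (absolutely old);
    # between 40 and 50 the degree rises linearly along the sloping line.
    if age <= 40.0:
        return 0.0
    if age >= 50.0:
        return 1.0
    return (age - 40.0) / 10.0
```

With these definitions, `boolean_old` flips abruptly from False at 44.9 to True at 45.0, while `fuzzy_old(45.0)` returns 0.5 — "somewhat old."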
Fuzzy Logic is used to reach a conclusion based on vague and noisy information that would be indeterminable by traditional problem-solving methods. It is used to mimic human control of an application. By using a descriptive yet imprecise language, it deals with the data much the way a human controller would. Though it is imprecise, it is still forgiving of data input and rarely needs much fine tuning.
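A tiny sketch of what "mimicking human control" can look like: rules stated in descriptive language ("IF temperature IS hot THEN fan IS fast") are blended by membership degrees. The temperatures, rule outputs, and function names here are all invented for illustration:

```python
def mu_hot(temp_c):
    # Membership in "hot": 0 below 20 C, 1 above 30 C, linear in between.
    return min(1.0, max(0.0, (temp_c - 20.0) / 10.0))

def mu_cold(temp_c):
    # Membership in "cold": the mirror image of "hot".
    return min(1.0, max(0.0, (30.0 - temp_c) / 10.0))

def fan_speed(temp_c):
    # Two descriptive rules:
    #   IF temperature IS hot  THEN fan IS fast (100%)
    #   IF temperature IS cold THEN fan IS slow (10%)
    # Blend (defuzzify) with a weighted average of the rule outputs.
    w_hot, w_cold = mu_hot(temp_c), mu_cold(temp_c)
    return (w_hot * 100.0 + w_cold * 10.0) / (w_hot + w_cold)
```

Between 20 and 30 degrees the controller eases the fan speed up smoothly, much as a human operator would, instead of snapping between two settings.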
Examples of Fuzzy Logic used for movie A.I. include Massive Software, which was used in The Lord of the Rings films as well as The Chronicles of Narnia.
Artificial Neural Networks
Artificial Neural Networks, sometimes referred to as just neural networks, are a decision-making system inspired by the nervous systems of animals. They work by having several processing units working together to solve a problem rather than just one. These networks, just like humans, learn by example.
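"Learning by example" can be sketched with the simplest such network, a single artificial neuron (a perceptron). This is an illustrative toy, not any production system; the example set is the logical AND function, and the names and parameters are made up:

```python
def predict(w, b, x1, x2):
    # The neuron fires (outputs 1) when the weighted sum crosses the threshold.
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    # Nudge the weights a little after every mistake -- learning by example.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            err = target - predict(w, b, x1, x2)
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# The examples: the logical AND function as (inputs, desired output) pairs.
and_samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_samples)
```

The neuron is never told the rule for AND; it only sees labeled examples and adjusts itself until its answers agree with them, after which `predict(w, b, 1, 1)` returns 1 and the other three inputs return 0.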
The first artificial neuron was created by Warren McCulloch, a neurophysiologist, and Walter Pitts, a logician, in 1943. However, it was limited by the technology of the time, so it was unable to do much. In 1949 Donald Hebb brought to light the importance of the fact that neural pathways are strengthened with each use. This insight became essential because of its similarity to the way in which humans learn. "MADALINE" was developed in 1959 by Widrow and Hoff of Stanford University. It was the first neural network applied to a real-world problem, eliminating echoes on phone lines, and the system is still in commercial use.
The early successes of neural networks caused an exaggeration of their potential, such as a computer that could program itself. This led to a fear of the "thinking machine" which is still felt today, though even with all the advances in technology we are far from achieving the "thinking machines" we so fear.
In 1982 there were three major breakthroughs in ANNs. The first came from John Hopfield, whose proposal renewed interest in the field. Early systems used only one-way connections between neurons, but Hopfield proposed bidirectional connections, which he believed could create more useful machines. Also that year, Reilly and Cooper created a multiple-layer network in which each layer used a different problem-solving strategy. This they called a "Hybrid network."