Machines could soon learn a new language by observing the world the way babies do, if the Pentagon’s Defense Advanced Research Projects Agency (DARPA) has its way, Defense One reports.
“Children learn to decipher which aspects of an observed scenario relate to the different words in the message from a tiny fraction of the examples that [machine-learning] systems require,” DARPA officials wrote in a government solicitation package for the Grounded Artificial Intelligence Language Acquisition (GAILA) project.
“ML technology is also brittle, incapable of dealing with new data sources, topics, media, and vocabularies. These weaknesses of ML as applied to natural language are due to exclusive reliance on the statistical aspects of language, with no regard for its meaning.”
The project, which is eligible for up to $1 million in funding, seeks an artificial intelligence prototype that can learn a language in much the same way a young child does – from visual and auditory cues.
The GAILA prototype would use those visual cues to describe what it observes before, during, and after an event.
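One family of techniques behind this kind of "grounded" language acquisition is cross-situational learning: a learner hears utterances paired with scenes and gradually works out which words track which observable features. The sketch below is a toy illustration of that idea, not GAILA's actual design; the scenes, features, and the `best_feature` helper are all invented for this example.

```python
from collections import defaultdict

# Toy cross-situational word learner (illustrative only, not DARPA's method).
# Each "scene" pairs an utterance with the visual features the learner observes.
scenes = [
    (["the", "red", "ball"], {"red", "round"}),
    (["the", "blue", "ball"], {"blue", "round"}),
    (["the", "red", "cup"], {"red", "concave"}),
]

# Count how often each word co-occurs with each visual feature.
counts = defaultdict(lambda: defaultdict(int))
word_totals = defaultdict(int)
for words, features in scenes:
    for w in words:
        word_totals[w] += 1
        for f in features:
            counts[w][f] += 1

def best_feature(word):
    """Return the feature most consistently observed alongside `word`."""
    feats = counts[word]
    return max(feats, key=lambda f: feats[f] / word_totals[word])

print(best_feature("red"))   # the word "red" co-occurs with the red feature in every scene
print(best_feature("ball"))  # "ball" tracks the round feature
```

Even from three scenes, the ambiguity resolves: "red" appears with and without "ball", so the statistics separate the color from the shape. Scaling this intuition to real video and speech is precisely the hard problem the solicitation describes.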
GAILA falls under DARPA’s AI Exploration program, which funds a variety of approaches to improving AI and allows the agency to “go after some of the more high-risk, uncertain spaces quickly to find out whether they’re on the critical path toward reaching our ultimate vision.”