Experts in animal cognition may be the AI industry’s secret weapon.
What's happening: Tech giants like Apple and Google have added neuroscientists studying rodents, birds, and fish to teams working on voice processing, sound recognition, and navigation, according to a story in Bloomberg Businessweek.
Farm team: Tech companies have been poaching talent from Frédéric Theunissen’s UC Berkeley Auditory Science Lab, where researchers combine animal behavior, human psychophysics, sensory neurophysiology, and theoretical and computational neuroscience:
- Channing Moore earned his doctorate in biophysics and joined Apple as an algorithms research engineer. Now at Google, he draws on his background in birdsong to teach sound recognition systems to distinguish similar noises, like a siren and a baby’s wail.
- Tyler Lee’s work with birds led to a job as a deep learning scientist at Intel, where he’s helping improve voice processing systems.
- Chris Fry went from studying the auditory cortex of finches to coding a natural language processor at a startup. That led to positions at Salesforce, Braintree, and Twitter before he decamped to Medium.
Opening the zoo: Bloomberg describes several ways animal cognition is influencing AI research:
- Zebra finch brains can pick out the song of their own species amid a cluttered sonic backdrop. Understanding how they do it could help voiceprint security systems recognize individual voices.
- Zebrafish (not finch) brains switch between predatory maneuvers and high-speed, straight-line swimming. Insights into that agility could help autonomous vehicles sharpen their navigation.
- Understanding how mouse brains compensate for unexpected changes in their environment could help engineers improve robot dexterity.
We’re thinking: Human-like cognition is a longstanding AI goal, but certain tasks don’t require that level of complexity. It’s not hard to imagine the lessons that rats running mazes might teach autonomous vehicles. And besides, who hasn’t felt like a caged animal during rush hour?