Is Progress In Autonomous Technology Gated By Research In Animal Communication?

Research in animal communication conjures up images of Jane Goodall studying the great apes in Tanzania. Indeed, a great deal of research in the field concerns higher-level functions such as social interaction, animal cognition, or even emotional life. Underneath all of these higher-level functions, however, lies a much more basic, elemental aspect of intelligence: simply surviving in the physical world. This capability seems so innate that there is little research on its basic functionality. What are its key characteristics?

First, there is a large class of intelligence connected to predicting the future physical behaviour of other actors in the environment. Key ideas include:

  1. Focus: By observing the eyes, and sometimes the ears, animals can interpret the direction of another actor’s attention.
  2. Body Positioning: Sitting, walking, and running postures are all interpreted. Postures that allow the other actor to accelerate quickly are of special interest; in general, acceleration is a “hot button” metric.
  3. Gestures and Intent: Facial as well as full-body gestures are interpreted for good or bad intent. (A toy sketch combining these three cues into a single score follows this list.)
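As a rough illustration, here is a minimal Python sketch of how these three cues might combine into a single threat score. Every name, weight, and scale in it is a hypothetical assumption for illustration, not a published model of animal perception.

```python
from dataclasses import dataclass

@dataclass
class ActorCues:
    """Hypothetical perceptual cues read off another actor."""
    attention_on_me: float  # 0..1, inferred from eye/ear direction
    accel_capacity: float   # 0..1, how quickly the posture allows acceleration
    intent: float           # -1 (friendly) .. +1 (hostile), from gestures

def threat_score(cues: ActorCues) -> float:
    """Toy weighting: acceleration capacity is the 'hot button' metric,
    so it gets the largest weight. All weights are illustrative only."""
    return (0.3 * cues.attention_on_me
            + 0.5 * cues.accel_capacity
            + 0.2 * max(cues.intent, 0.0))

# A crouched actor staring straight at us with hostile body language:
print(threat_score(ActorCues(attention_on_me=1.0, accel_capacity=0.9, intent=0.8)))  # 0.91
```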

Second, there is a basic, implicit calculation of physics. No, animals are not solving Newton’s equations of motion. However, animals innately maintain balance, calculate interception trajectories, and manage the potential for threats to physically harm them.
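The kind of calculation involved can be stated very simply. Here is a minimal sketch of one such computation, the time to closest approach between two actors; the function name and the constant-velocity assumption are mine, added purely for illustration.

```python
import numpy as np

def time_to_closest_approach(p_rel: np.ndarray, v_rel: np.ndarray) -> float:
    """Given the relative position (m) and relative velocity (m/s) of another
    actor, return the time (s) at which it comes closest, assuming constant
    velocity. Found by minimizing |p_rel + t * v_rel|^2 over t >= 0."""
    speed_sq = float(v_rel @ v_rel)
    if speed_sq == 0.0:
        return 0.0  # no relative motion; it is as close now as it will ever be
    return max(0.0, -float(p_rel @ v_rel) / speed_sq)

# Another actor 10 m ahead, closing at 2 m/s:
print(time_to_closest_approach(np.array([10.0, 0.0]), np.array([-2.0, 0.0])))  # 5.0 s
```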

Third, animals maintain a basic virtual mental model of their surrounding environment, and this model seems to generate an expectation which drives perception. The difference between perception and expectation, combined with absolute distance, seems to be a central concept driving behavior. For example, a “surprise” awareness of an unknown object in close proximity drives a highly visceral response.
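That prediction-error-scaled-by-proximity idea is easy to express. The following toy function is an assumption of mine, sketched for illustration only:

```python
import numpy as np

def surprise_response(expected_pos: np.ndarray,
                      perceived_pos: np.ndarray,
                      observer_pos: np.ndarray) -> float:
    """Toy model: the response grows with the mismatch between where the
    mental model expected an object and where it is actually perceived,
    and is amplified as the object gets closer to the observer."""
    prediction_error = float(np.linalg.norm(perceived_pos - expected_pos))
    distance = float(np.linalg.norm(perceived_pos - observer_pos))
    return prediction_error / (distance + 1e-6)  # epsilon avoids divide-by-zero

# An object the model placed 5 m away is suddenly perceived 0.5 m away:
print(surprise_response(np.array([5.0, 0.0]), np.array([0.5, 0.0]),
                        np.array([0.0, 0.0])))  # ~9.0, a strong response
```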

With this range of perception, mental modeling, and non-verbal cross-species communication, animals perform what autonomous-technology researchers would call “threat assessment” and “path planning.” Of course, humans have exactly the same capabilities in the lower levels of the brain; something like the “Ellen Scares Guests” videos demonstrates the principle effectively.

What does all of this have to do with Autonomous Vehicles in a well-regulated transportation network?

Research on the AV accidents to date has shown that the vast majority are caused by humans hitting AVs in low-speed rear-end collisions. Why do humans hit AVs at a higher rate than they hit other humans? It appears the answer is a miscalculation of the AV’s future behaviour.

It is the nature of human beings to anthropomorphize, and we do so with automobiles as well. Humans interpret micro-braking, micro-accelerations, drift within the lane, and other factors in our own threat assessment of the situation. All of these nonverbal movements are a source of active communication for human beings. Layered on top of this interpretation is more explicit non-verbal communication through eye contact or hand gestures. Overall, this creates a non-verbal language-of-driving which effectively makes the whole cooperative transportation system work.
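To make “micro-braking” concrete, here is one hypothetical way to count such signals in a sampled speed trace. The jerk threshold and sampling rate are assumptions for illustration; a real system would learn such thresholds from data.

```python
import numpy as np

def micro_motion_events(speeds: np.ndarray, dt: float,
                        jerk_threshold: float = 15.0) -> int:
    """Count abrupt acceleration changes ("micro-braking" or
    "micro-acceleration") in a speed trace sampled every dt seconds.
    Jerk (m/s^3) above the threshold marks a change sharp enough that
    a following human driver would register it as a signal."""
    accel = np.diff(speeds) / dt  # m/s^2
    jerk = np.diff(accel) / dt    # m/s^3
    return int(np.sum(np.abs(jerk) > jerk_threshold))

# A hesitant lead vehicle: small repeated brake taps, sampled at 10 Hz.
trace = np.array([10.0, 10.0, 9.8, 10.0, 9.7, 10.0, 9.9, 10.0])
print(micro_motion_events(trace, dt=0.1))  # 6 events -> reads as hesitation
```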

When autonomous vehicles do not participate in this communication, they create a danger to the overall system. AV researchers are just starting to look at aspects of this problem, focusing on near-term questions such as interpreting the potential movements of pedestrians at intersections. However, the fundamentals seem to be much deeper and broader. To be effective, AVs will likely have to interpret a broader language, and this analysis may well have to extend to the behaviour of animals (deer, cats, dogs, etc.) as well; after all, they “share” the road in residential settings. Note that this language-of-driving may well have regional dialects: a Boston driver and a Cincinnati driver speak noticeably different versions of it.

“The primary and underappreciated challenge in the application of autonomy is understanding & exploiting human/machine teaming,” says Ken Ford, CEO of the Institute for Human Machine Cognition (IHMC).

Through the world of Natural Language Processing (Alexa, etc.), researchers have built a reasonable understanding of spoken and written language. However, this much more basic form of communication and perception is only in the beginning stages of being understood.

It seems the capability is so innate that we didn’t know we had it until we had to recreate it in autonomous systems.
