
Overview
While research indicates that many animals communicate primarily through gestures, our ability to understand and interpret these signals remains limited. Imagine if we could use AI models to translate animal body language into human-understandable messages. Such a breakthrough could revolutionize fields such as animal behaviour research, wildlife conservation, robotics, and even pet care, but achieving it is no simple task.
Challenge
Develop a conceptual model or AI-based system that aims to decode and interpret animal gestures using images, real or simulated. Your solution should explore how AI or a model could identify patterns in animal movements and propose potential meanings behind these gestures.
Data: Use existing datasets or create a simulated dataset (e.g., annotated images or drawings) of animal gestures. You could focus on a single species or a subset of gestures.
Analysis: How can we link specific gestures to potential communicative intents?
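If you choose the simulated-data route, a minimal sketch of what an annotated gesture dataset might look like is shown below. The gesture labels, their proposed meanings, and the image file names are all illustrative placeholders, not established ethological findings:

```python
# A minimal sketch of a simulated gesture dataset: each record pairs a
# (hypothetical) image file with a gesture label and a proposed meaning.
# All labels and meanings here are illustrative placeholders.
import csv
import random

GESTURES = {
    "tail_high": "confident greeting",
    "ears_flat": "fear or irritation",
    "play_bow": "invitation to play",
    "slow_blink": "relaxed, non-threatening",
}

def make_dataset(n=20, seed=0):
    """Generate n annotated records as (image_id, gesture, proposed_meaning)."""
    rng = random.Random(seed)
    rows = []
    for i in range(n):
        gesture = rng.choice(sorted(GESTURES))
        rows.append((f"img_{i:03d}.jpg", gesture, GESTURES[gesture]))
    return rows

def save_csv(rows, path="gestures.csv"):
    """Write the annotated records to a CSV file for later analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image_id", "gesture", "proposed_meaning"])
        writer.writerows(rows)

rows = make_dataset()
```

A fixed random seed keeps the simulated dataset reproducible, which makes it easier to compare analysis runs across your team.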
Output ideas
Ideas include visual and written representations of your solution, such as mock-ups, flowcharts, or conceptual frameworks; a mock-up of an app or interface that could one day facilitate animal-to-human communication; or, for teams using code, a prototype, model, or simulation that analyses images of animals and outputs interpretations of their gestures.
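To make the prototype idea concrete, here is a hedged sketch of one possible approach: a rule-based interpreter that maps simple pose features (which a vision model would extract from an image) to candidate interpretations. The feature names, thresholds, and interpretations are assumptions for illustration only:

```python
# A sketch of a rule-based gesture interpreter. In a full system, a vision
# model would extract these pose features from an image; here they are
# supplied directly. Thresholds and meanings are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PoseFeatures:
    tail_height: float  # 0.0 (tucked) to 1.0 (raised), assumed normalised
    ear_angle: float    # degrees from upright; negative = flattened back

def interpret(features: PoseFeatures) -> str:
    """Return a tentative interpretation for one set of pose features."""
    if features.ear_angle < -30:
        return "possible fear or irritation (ears flattened)"
    if features.tail_height > 0.8:
        return "possible confident greeting (tail raised)"
    if features.tail_height < 0.2:
        return "possible anxiety (tail tucked)"
    return "neutral / ambiguous posture"

print(interpret(PoseFeatures(tail_height=0.9, ear_angle=0)))
```

Hedged wording in the outputs ("possible ...") reflects that linking a gesture to an intent is itself a hypothesis your solution should test, not a settled fact.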
Resources
Feighelstein, M., Henze, L., Meller, S. et al. Explainable automated pain recognition in cats. Sci Rep 13, 8973 (2023). https://doi.org/10.1038/s41598-023-35846-6