Photo by Alexander Mielke

This project, awarded to Jerome Micheletta, Bridget Waller, and Julie Duboscq by the Leverhulme Trust, uses network science to untangle the complexity of primate and human facial expression. Facial expressions are the most ubiquitous form of communication in many primate species; they are also the most ephemeral: fleeting and flexible, hard to film and harder to quantify. Recent decades have brought advances in capturing the information content of facial communication, notably the adaptation of the muscle-based Facial Action Coding System (FACS) to nonhuman primates, with automated solutions always just a step away. However, statistical approaches for extracting meaning from this wealth of data have lagged behind. I helped develop NetFACS, an open-access statistical tool for R (available on GitHub) that treats facial muscle movements as nodes in different types of networks, elucidating their connections with each other as well as their functions in different communicative contexts. NetFACS allows us to study facial expressions not as static displays but as dynamic signals with which individuals convey situation-specific information.
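The core idea, treating facial movements as nodes and their co-occurrence as edges, can be illustrated with a minimal sketch. This is not the NetFACS implementation (which is an R package); it is a hypothetical Python example with made-up action unit (AU) data, showing how co-occurrence counts yield a weighted network:

```python
# Illustrative sketch only, NOT the NetFACS code: FACS action units (AUs)
# become network nodes; AUs active in the same expression are linked.
from itertools import combinations
from collections import Counter

# Hypothetical coded expressions: each is the set of AUs active together.
expressions = [
    {"AU6", "AU12"},           # e.g. a smile-like display
    {"AU6", "AU12", "AU25"},
    {"AU1", "AU2", "AU5"},     # e.g. a surprise-like display
    {"AU1", "AU2"},
]

# Edge weight = number of expressions in which a pair of AUs co-occurs.
edges = Counter()
for aus in expressions:
    for pair in combinations(sorted(aus), 2):
        edges[pair] += 1

for (a, b), weight in sorted(edges.items()):
    print(a, b, weight)
```

Comparing such networks across communicative contexts is what lets the approach ask which movements carry situation-specific information, rather than scoring each expression in isolation.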

NetFACS is now available on GitHub and is maintained by me and Alan Rincon.

Our preprint can be found here: