Wearable movement and physiology sensors such as smartwatches, smart glasses, and earbuds offer lightweight, non-invasive, and ecologically valid means to monitor human activity, affective state, and social behavior. With the rise of commercially deployed devices and new wearable foundation models, opportunities for scalable human behavior analysis continue to grow. However, challenges such as personalized modeling, on-device integration, and multimodal fusion persist, limiting the in-the-wild deployment of wearable devices.
Recent research has focused on advancing both the modeling and sensing capabilities of wearable systems. In particular, the emergence of wearable foundation models enables researchers to work with expressive representations that scale to large populations and capture complex signals. In parallel, researchers are developing increasingly sophisticated devices to capture physiological signals non-invasively, e.g., remote photoplethysmography (rPPG), which can measure heart rate from videos of the face. These advances have enabled the widespread adoption of commercial wearables such as smartwatches, allowing individuals to monitor their sleep, mood, or health. They are also supporting a variety of applications, from tracking affective states and social behaviors to human-robot interaction, mobile health, and behavioral research in both laboratory and real-world contexts.
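To make the rPPG example concrete, the sketch below illustrates one classical pipeline: spatially averaging the green channel of a face video (where the pulse signal is strongest), band-pass filtering to the plausible heart-rate band, and reading the dominant spectral peak as the pulse rate. This is a minimal illustration, not a specific published method; the function name `estimate_heart_rate` and its inputs (`frames`, a pre-cropped face region, and `fps`) are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_heart_rate(frames, fps):
    """Minimal rPPG sketch: mean green channel -> band-pass -> spectral peak.

    frames: array of shape (T, H, W, 3), RGB face crops over time (assumed input).
    fps:    video frame rate in Hz.
    Returns the estimated heart rate in beats per minute.
    """
    # 1. Spatially average the green channel, which carries the strongest pulse signal.
    green = frames[..., 1].reshape(len(frames), -1).mean(axis=1)

    # 2. Remove the mean and band-pass to the plausible heart-rate band
    #    (0.7-4 Hz, i.e., roughly 42-240 bpm).
    green = green - green.mean()
    b, a = butter(3, [0.7, 4.0], btype="bandpass", fs=fps)
    pulse = filtfilt(b, a, green)

    # 3. Take the dominant frequency of the filtered signal as the pulse rate.
    freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(pulse))
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz  # Hz -> beats per minute
```

On a synthetic 10 s clip at 30 fps whose green channel oscillates at 1.2 Hz, this recovers roughly 72 bpm; real deployments add face tracking, skin segmentation, and motion-artifact suppression on top of this basic scheme.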
However, significant challenges remain in both computational modeling and sensor design. Human behavior is inherently complex, context-dependent, and individual-specific, making its analysis through wearable sensing particularly difficult. Even in controlled environments where task complexity is limited, learning personalized and/or generalized models remains hard due to the high variability across individuals and our incomplete understanding of the underlying physiological mechanisms. This challenge is compounded by the multimodal nature of human behavior, with different physiological and movement signals conveying distinct yet complementary information. Finally, as wearables are designed to monitor people in real-world, uncontrolled settings, they raise additional concerns related to privacy, ethical use, and data integrity.
The 1st Workshop on Behavioral and Emotion Analysis through wearable Technology (BEAT) aims to foster collaboration between ML researchers from various backgrounds (Gesture & Face Analysis, Affective Computing, HRI) and researchers in biomedical engineering. The main focus is on lightweight wearable movement and physiological sensors for computational human behavior analysis. While contributions are expected to center on real-world and ecologically valid settings, we also welcome controlled laboratory studies that introduce novel sensing approaches, benchmark datasets, or innovative applications.
Topics of interest include, but are not limited to:
- Machine Learning and computational models for movement and physiological wearables
- Resource-efficient and lightweight models
- Multimodal fusion and synchronization strategies
- Methods for irregularly sampled or missing data
- Individual differences, personalization, and context-awareness
- Ethical and privacy-preserving AI in wearable systems
- Novel wearables and applications
- Experimental methods for validation of wearable systems
- Lab-controlled experiments and in-the-wild deployment
- Datasets and benchmarks
- Responsible data management and user consent
- Applications in Affective Computing / Mobile Health / Action Recognition / Social Interaction / HRI