You can feel the stress building—you’re on deadline, your computer has slowed to a crawl, you’re pounding keys in frustration, and your blood is boiling. You’re about to explode. And at that exact moment, your computer tells you to take a deep breath and go for a walk.
Thanks to a team of researchers in the VIBE (Visualization and Interaction for Business and Entertainment) group at Microsoft Research, the technology that would make such an intervention possible is a work in progress, drawing on human-computer interaction and clinical psychology. Three years ago, the team, led by research manager Mary Czerwinski, began working in affective computing: designing systems—some incorporating wearable devices—that attempt to identify your mood and react accordingly, helping you reflect on your own state.
There are all kinds of ways a system could detect what you’re feeling: sensors could monitor your facial expressions, how quickly you are typing, the intensity of each keystroke, or the stress in your voice. Machine learning and data analytics could then tie all of that data together to predict accurately how you are feeling. Your computer may not be able to read you—yet—but research in affective computing could make that a reality soon. A key tenet of the team’s work is understanding and aiding emotional health to improve quality of life. As Czerwinski puts it: “Our research goes beyond traditional fitness. It’s about emotional fitness.”
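To make the idea concrete, here is a minimal, entirely hypothetical sketch of how such sensor signals might be fused into a single stress estimate. The feature names, weights, and threshold below are illustrative assumptions, not details of the team’s actual system, which would rely on trained machine-learning models rather than hand-picked weights.

```python
import math

def stress_score(features, weights, bias=0.0):
    """Weighted combination of normalized sensor features, squashed to a 0..1 score."""
    z = bias + sum(weights[name] * features.get(name, 0.0) for name in weights)
    return 1.0 / (1.0 + math.exp(-z))  # logistic squashing

# Illustrative weights: faster, harder typing and a tenser voice push the score up.
WEIGHTS = {"typing_speed": 0.8, "keystroke_intensity": 1.2, "voice_tension": 1.5}

# Toy feature readings, each normalized to the 0..1 range.
calm = {"typing_speed": 0.2, "keystroke_intensity": 0.1, "voice_tension": 0.1}
frazzled = {"typing_speed": 0.9, "keystroke_intensity": 0.8, "voice_tension": 0.9}

# The kind of intervention the article imagines: act when the score crosses a threshold.
if stress_score(frazzled, WEIGHTS, bias=-1.5) > 0.7:
    print("Suggest a deep breath and a walk")
```

A real system would learn the weights from labeled data and handle far noisier signals, but the core pattern—many weak sensor cues combined into one actionable estimate—is the same.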