The app captures an image of the user at a set interval (say, every 2 minutes), then posts that image data to the Azure API as a binary blob. Microsoft returns analysis data for the image, which is fed into a series of custom algorithms that monitor changes against the user's previous images and analyses. For example, three consecutive increases in the "anger" score, each greater than 50% in magnitude, trigger a pop-up suggesting the user take a 5-minute break. Currently the app monitors the user's smile, anger, and contempt scores.
Project Members: Dave Trabka