Abstract: Automated dietary monitoring is essential for gaining insights into eating behaviors, especially for managing chronic conditions such as obesity, diabetes, and hypercholesterolemia. Earable-based inertial sensing has shown promise for detecting chewing and eating activities; however, effective dietary assessment also requires further insights into what, when, and how much is being eaten. We therefore propose BiteSense, an earable-based system that leverages an inertial measurement unit (IMU) to monitor food intake and classify food types. Using a hierarchical classification model, the system analyzes masticatory kinematics to detect food states, textures, nutritional value, and cooking methods, ultimately identifying the specific foods consumed and estimating intake amount and meal type. A semi-controlled user study with 38 participants from diverse backgrounds demonstrated the system’s high accuracy, achieving an F1 score of 0.86 for detecting the masticatory process under a leave-one-subject-out (LOSO) evaluation and outperforming benchmark algorithms by 8-12% in extensive experiments.