|Title|Stool Image Analysis for Precision Health Monitoring by Smart Toilets|
|Publication Type|Conference Proceedings|
|Year of Publication|2021|
|Authors|Jin Zhou, Nick DeCapite, Jackson McNabb, Jose R. Ruiz, Deborah A. Fisher, Sonia Grego, and Krishnendu Chakrabarty|
|Conference Name|Machine Learning for Healthcare|
|Publisher|Proceedings of Machine Learning Research|
|Keywords|artificial intelligence, blood, gastroenterology, machine learning, precision health monitoring, stool, wastewater|
Precision health monitoring is facilitated by long-term data collection that establishes a health baseline and enables the detection of deviations from it. With the advent of the Internet of Things, monitoring of daily excreta from a toilet is emerging as a promising tool for the long-term collection of physiological data. This paper describes a stool image analysis approach that accurately and efficiently tracks stool form and visible blood content using a Smart Toilet. The Smart Toilet can discreetly image stools in the toilet plumbing, out of the user's view. We constructed a stool image dataset of 3,275 images spanning all seven types of the Bristol Stool Form Scale, a widely used metric for stool classification; ground-truth labels were obtained from two gastroenterologists. We addressed three limitations associated with the application of computer-vision techniques to a smart toilet system: (i) uneven separability between different stool form categories; (ii) class imbalance in the dataset; and (iii) limited computational resources in the microcontroller integrated with the Smart Toilet. We present results on the use of a class-balanced loss, and of hierarchical and compact convolutional neural network (CNN) architectures, for training a stool-form classifier. We also present results obtained using perceptual color quantization coupled with mutual information to optimize the color-feature space for the detection of stool images with gross (visible) blood content. For stool-form classification, we achieve a balanced accuracy of 81.66% using a hierarchical CNN based on MobileNetV2. For gross blood detection, a decision tree (DT) classifier provides 74.64% balanced accuracy.
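A class-balanced loss, as mentioned in the abstract, typically reweights each class by the inverse of its "effective number" of samples, so that rare Bristol types contribute more to the loss than abundant ones. A minimal NumPy sketch of this reweighting is shown below; the per-class counts are hypothetical placeholders, not the paper's actual dataset statistics, and the exact loss formulation used by the authors may differ.

```python
import numpy as np

def class_balanced_weights(samples_per_class, beta=0.999):
    """Per-class loss weights proportional to the inverse effective
    number of samples, where the effective number for a class with
    n samples is (1 - beta**n) / (1 - beta)."""
    counts = np.asarray(samples_per_class, dtype=float)
    effective_num = (1.0 - np.power(beta, counts)) / (1.0 - beta)
    weights = 1.0 / effective_num
    # Normalize so the weights sum to the number of classes.
    return weights * len(counts) / weights.sum()

# Hypothetical per-class counts for the seven Bristol types (illustrative only).
counts = [120, 340, 610, 900, 650, 410, 245]
weights = class_balanced_weights(counts, beta=0.999)
# Rarer classes receive larger weights, which can then scale a
# per-sample cross-entropy term during training.
```

These weights would multiply the cross-entropy term for each training sample according to its class, counteracting the dataset imbalance noted as limitation (ii).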