Lessons from Usable ML Deployments and Application to Wind Turbine Monitoring

Published: 27 Oct 2023, Last Modified: 23 Nov 2023 · NeurIPS XAIA 2023
TL;DR: We are incorporating lessons learned from past work in deploying usable and explainable ML to real-world domains to improve the process of wind turbine monitoring.
Abstract: Through past experiences deploying what we call usable ML (one step beyond explainable ML, encompassing both explanations and other augmenting information) to real-world domains, we have learned three key lessons. First, many organizations are beginning to hire people whom we call "bridges" because they bridge the gap between ML developers and domain experts, and these people fill a valuable role in developing usable ML applications. Second, a configurable system that enables easy iteration on usable ML interfaces during collaborations with bridges is key. Finally, there is a need for continuous, in-deployment evaluations to quantify the real-world impact of usable ML. Throughout this paper, we apply these lessons to the task of wind turbine monitoring, an essential task in the renewable energy domain. Turbine engineers and data analysts must decide whether to perform costly in-person investigations of turbines to prevent potential cases of brakepad failure, and well-tuned usable ML interfaces can aid this decision-making process. Through the application of our lessons to this task, we hope to demonstrate the potential real-world impact of usable ML in the renewable energy domain.
Submission Track: Full Paper Track
Application Domain: None of the above / Not applicable
Clarify Domain: XAI in action for renewable energy
Survey Question 1: We apply practical lessons we have learned from past experience, concerning the roles, evaluation methods, and systems required for deploying usable and explainable ML, to the real-world domain of wind turbine monitoring. We explain the challenges involved in developing an explainable ML system tuned to the needs of the domain. We also discuss the many challenges involved in effectively evaluating the impact of explainable ML in a real-world domain, and introduce our plans to do so.
Survey Question 2: In the wind turbine monitoring domain, the ML model alerts users to potential cases of turbine brakepad failure. Adding explanations and other augmenting information to this alert helps users determine whether action needs to be taken; both unnecessarily inspecting a turbine in person and missing a real case of brakepad failure are costly mistakes. Additionally, explanations give users information that can help them pinpoint the potential cause of failure, making it easier for them to suggest next steps.
Survey Question 3: Our work uses SHAP, nearest neighbors, and dataset visualization.
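As an illustration of the kind of feature-attribution output referenced above, the following is a minimal sketch (not the authors' code) of attaching a SHAP explanation to a failure alert from a tree-based classifier. The sensor feature names, synthetic data, and model choice are all hypothetical assumptions for the example.

```python
# Minimal sketch: pairing a turbine-failure alert with a SHAP explanation.
# Feature names, data, and model are illustrative, not the paper's actual system.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
feature_names = ["brake_temp", "rotor_speed", "vibration", "ambient_temp"]  # hypothetical sensors
X = rng.normal(size=(500, len(feature_names)))
# Synthetic labels: failures loosely driven by brake temperature and vibration.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Explain one alerted reading: which sensors pushed the model toward "failure"?
explainer = shap.TreeExplainer(model)
alerted_reading = X[:1]
sv = explainer(alerted_reading)

# Output shape varies across SHAP versions; take contributions toward the
# positive ("failure") class either way.
vals = sv.values
failure_contribs = vals[0, :, 1] if vals.ndim == 3 else vals[0]
for name, contribution in zip(feature_names, failure_contribs):
    print(f"{name}: {contribution:+.3f}")
```

In an interface like the one described, a ranked list of such per-sensor contributions (alongside nearest-neighbor examples and dataset visualizations) would accompany each alert to help the user decide whether an in-person inspection is warranted.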
Submission Number: 18