Towards End-to-End Explainability in Food Science

ARRIGHI, LEONARDO
  • doctoral thesis

Abstract
In the food industry, Artificial Intelligence (AI) techniques play a fundamental role, especially in product quality analysis, a field that requires processing large volumes of heterogeneous data and handling the complexity inherent in the subjective assessment of quality that is often needed in this sector. In response to the need for reliable AI models, EXplainable Artificial Intelligence (XAI) has emerged, providing explanations of how models make decisions and improving their transparency and interpretability. XAI is therefore relevant in the food industry, since it makes AI technologies more transparent, safer, and applicable even to delicate products such as food. In this doctoral thesis, we explore the application of XAI methods to the analysis of food product quality, proposing innovative solutions that integrate into an XAI pipeline across the food supply chain. More specifically, we use tabular data, which are common in the food industry, and tree-based ensemble models in their analysis. We introduce a new XAI technique, the Decision Predicate Graph (DPG), which helps understand how these models make decisions. DPG is a graph structure that captures relationships among features, logical decisions, and model predictions. We demonstrate the effectiveness of DPG in three case studies involving three types of fruit at different ripeness stages, using physicochemical data to improve AI-based predictions of food quality parameters. We examine the development of a complete AI workflow for quality analysis. XAI is necessary not only for understanding the final model but also for improving data preparation, reducing bias, and boosting performance. We propose a new XAI method to explain Isolation Forest, a technique used to identify potential outliers. The proposed approach builds on DPG and provides a comprehensive view of the decision process in outlier detection, thereby enabling an end-to-end XAI-based pipeline. 
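The abstract describes the DPG as a graph whose nodes are decision predicates (feature-threshold splits) and predictions, linked along the paths of the ensemble's trees. As a hedged illustration only, the toy sketch below extracts such predicates from a scikit-learn random forest and aggregates path edges into a weighted graph; the actual DPG construction in the thesis may differ substantially (e.g. in predicate merging, graph metrics, and visualization), and the function name `build_predicate_graph` and the feature names are assumptions for this example.

```python
# Hedged sketch: a toy approximation of a predicate graph over a tree ensemble.
# NOT the thesis's DPG algorithm; just the general idea of turning ensemble
# decision paths into a graph of "feature <= threshold" predicates.
from collections import defaultdict
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

def build_predicate_graph(forest, feature_names, class_names):
    """Collect weighted edges between decision predicates across all trees.

    Each internal node becomes a predicate "feature <= threshold"; each leaf
    becomes a prediction label. Edges follow parent-to-child tree paths, and
    the weight counts how often an edge appears across the ensemble.
    """
    edges = defaultdict(int)
    for est in forest.estimators_:
        t = est.tree_

        def label(i):
            if t.children_left[i] == -1:  # leaf node
                return f"pred: {class_names[t.value[i].argmax()]}"
            return f"{feature_names[t.feature[i]]} <= {t.threshold[i]:.2f}"

        for i in range(t.node_count):
            if t.children_left[i] != -1:  # internal node: add both child edges
                edges[(label(i), label(t.children_left[i]))] += 1
                edges[(label(i), label(t.children_right[i]))] += 1
    return edges

X, y = load_iris(return_X_y=True)
names = ["sepal_len", "sepal_wid", "petal_len", "petal_wid"]
rf = RandomForestClassifier(n_estimators=5, max_depth=2, random_state=0).fit(X, y)
graph = build_predicate_graph(rf, names, ["setosa", "versicolor", "virginica"])
for (src, dst), w in sorted(graph.items())[:5]:
    print(f"{src} -> {dst}  (weight {w})")
```

Aggregating identical predicates across trees is what lets a graph view summarize an entire ensemble rather than one tree at a time.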
Finally, we evaluate the proposed methods in a real case study and assess their effectiveness in an important operational context such as finance. In conclusion, our findings highlight the effectiveness of XAI methods in improving the analysis of food product quality, making predictions more accurate and reliable, reducing bias, and supporting the adoption of advanced AI technologies to ensure high standards of quality and safety.
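For readers unfamiliar with the base detector the thesis sets out to explain, the sketch below shows plain Isolation Forest outlier detection on simulated two-feature physicochemical data. The data values and the `contamination` setting are illustrative assumptions; the thesis's DPG-based explanation layer is not reproduced here.

```python
# Hedged sketch: anomaly detection with scikit-learn's IsolationForest,
# the model whose decision process the thesis's XAI method aims to explain.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Simulated physicochemical measurements (e.g. acidity, sugar content).
normal = rng.normal(loc=[3.5, 12.0], scale=[0.2, 0.5], size=(100, 2))
outliers = np.array([[6.0, 20.0], [1.0, 5.0]])  # two implausible samples
X = np.vstack([normal, outliers])

iso = IsolationForest(n_estimators=100, contamination=0.02,
                      random_state=0).fit(X)
labels = iso.predict(X)  # -1 = outlier, 1 = inlier
print("flagged as outliers:", np.where(labels == -1)[0])
```

Isolation Forest scores points by how quickly random splits isolate them, which is exactly the per-split decision structure a predicate-graph explanation can make visible.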
Archive
https://hdl.handle.net/11368/3124118
https://ricerca.unityfvg.it/handle/11368/3124118
Rights
open access
FVG URL
https://arts.units.it/bitstream/11368/3124118/2/Final_Version_Arrighi.pdf
Subjects
  • Machine Learning

  • Explainable AI

  • Responsible AI

  • XAI Pipeline

  • Food AI

  • Settore INF/01 - Info...

Powered by DSpace-CRIS Software - Extension maintained and optimized by 4Science
