Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/103238
Title: On the clinical acceptance of black-box systems for EEG seizure prediction
Authors: Pinto, Mauro F.; Leal, Adriana Costa; Lopes, Fábio; Pais, José; Dourado, António; Sales, Francisco; Martins, Pedro; Teixeira, César A.
Keywords: actor network theory; grounded theory; interpretability/explainability; machine learning; seizure prediction
Issue Date: 2022
Serial title, monograph or event: Epilepsia Open
Volume: 7
Issue: 2
Abstract: Seizure prediction may be the solution for epileptic patients whose seizures are not controlled by drugs or surgery. Despite 46 years of research, few devices/systems have undergone clinical trials or been commercialized, and the most recent state-of-the-art approaches, such as neural network models, are not used to their full potential. This gap points to social barriers to new methodologies, related to data bias, patient safety, and legislation compliance. In the form of a literature review, we performed a qualitative study to analyze the seizure prediction ecosystem and identify these social barriers. With Grounded Theory, we drew hypotheses from the data, while with Actor-Network Theory we considered that technology shapes social configurations and interests, which is fundamental in healthcare. We obtained a social network that describes the ecosystem and propose research guidelines aimed at clinical acceptance. Our most relevant conclusion is the need for model explainability, but not necessarily intrinsically interpretable models, in the case of seizure prediction. Accordingly, we argue that it is possible to develop robust prediction models, including black-box systems to some extent, while avoiding data bias, ensuring patient safety, and still complying with legislation, provided they can deliver human-comprehensible explanations. Due to skepticism and patient safety concerns, many authors advocate the use of transparent models, which may limit their performance and potential. Our study highlights a possible path, based on model explainability, to overcome these barriers while allowing the use of more computationally robust models.
URI: https://hdl.handle.net/10316/103238
ISSN: 2470-9239
DOI: 10.1002/epi4.12597
Rights: openAccess
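The abstract's central claim is that black-box predictors can remain clinically acceptable if they deliver human-comprehensible explanations, without requiring intrinsically interpretable models. Purely as an illustration of that idea, and not as the authors' method or part of this record, the minimal sketch below pairs a generic black-box classifier with a model-agnostic, post-hoc explanation (permutation feature importance); the EEG band-power feature names and the synthetic data are hypothetical placeholders.

```python
# Illustrative sketch only (not the paper's method): post-hoc explanation
# of a black-box classifier, showing how a model that is not intrinsically
# interpretable can still produce a human-readable feature ranking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-window EEG features (e.g., spectral band powers);
# labels mark preictal (1) vs. interictal (0) windows.
feature_names = ["delta_power", "theta_power", "alpha_power",
                 "beta_power", "gamma_power"]
X = rng.normal(size=(500, len(feature_names)))
y = (X[:, 3] + 0.5 * X[:, 4] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Black-box" model standing in for, e.g., a neural network.
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Model-agnostic, post-hoc explanation: how much does shuffling each
# feature degrade held-out performance?
result = permutation_importance(model, X_test, y_test,
                                n_repeats=20, random_state=0)
for name, mean, std in zip(feature_names,
                           result.importances_mean,
                           result.importances_std):
    print(f"{name}: {mean:.3f} +/- {std:.3f}")
```

Under these assumptions, the printed ranking is the kind of feature-level explanation a clinician could inspect alongside the model's alarms, which is the role the abstract assigns to explainability.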
Appears in Collections: I&D CISUC - Artigos em Revistas Internacionais
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| Epilepsia Open - 2022 - Pinto - On the clinical acceptance of black‐box systems for EEG seizure prediction.pdf | | 2.8 MB | Adobe PDF |
SCOPUS™ Citations: 7 (checked on Dec 25, 2023)
Page view(s): 186 (checked on Oct 2, 2024)
Download(s): 205 (checked on Oct 2, 2024)
This item is licensed under a Creative Commons License