
How well do explanation methods for machine-learning models work?

Feature-attribution methods are used to check whether a neural network is relying on the right parts of its input when completing a task such as image classification. MIT researchers developed a way to evaluate whether these feature-attribution methods correctly identify the features of an image that are important to a neural network's prediction.
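To make the idea concrete, below is a minimal sketch of one common feature-attribution method, gradient saliency, which scores each input pixel by how much the prediction changes when that pixel changes. It only illustrates what such methods compute; it is not the evaluation procedure the MIT researchers developed, and the choice of PyTorch and a pretrained ResNet-18 is an assumption for illustration.

```python
# Illustrative sketch of gradient-saliency feature attribution (assumes PyTorch
# and torchvision are installed; the model choice is arbitrary for this example).
import torch
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

def gradient_saliency(image: torch.Tensor, target_class: int) -> torch.Tensor:
    """Return a per-pixel importance map: |d score(target_class) / d input|."""
    image = image.clone().requires_grad_(True)     # shape (1, 3, H, W), normalized
    score = model(image)[0, target_class]          # logit for the class of interest
    score.backward()                               # gradients w.r.t. input pixels
    return image.grad.abs().max(dim=1).values[0]   # collapse channels -> (H, W) map

# Usage (hypothetical inputs): pixels with large saliency values are the ones the
# attribution method claims were important to the model's prediction.
# saliency = gradient_saliency(preprocessed_image, predicted_class)
```

Evaluating such a method then amounts to checking whether the highlighted pixels really are the ones the network depends on, which is the question the MIT work addresses.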

Yilun-zhou
Julie-shah
Marco-tulio-ribeiro
Microsoft-research
Artificial-intelligence-laboratory
Interactive-robotics-group
National-science-foundation
Computer-science
Serena-booth
Explainable-artificial-intelligence
Feature-attribution
