Interpretability Measure


An Interpretability Measure is a quantitative measure of how easily a human can understand the decisions or predictions made by a machine learning model or algorithm.
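
As a purely illustrative sketch (not from the source), one very simple proxy for such a measure is model sparsity: the fraction of input features a fitted model ignores, on the assumption that a model relying on fewer features is easier for a human to understand. The function name sparsity_interpretability_score, the choice of proxy, and the use of an L1-regularized (Lasso) linear model are illustrative assumptions, not a standard definition.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical proxy measure (assumption for illustration): the fraction of
# input features whose fitted coefficient is exactly zero. A higher score
# means fewer features a human must reason about.
def sparsity_interpretability_score(coefficients: np.ndarray) -> float:
    coefficients = np.asarray(coefficients).ravel()
    if coefficients.size == 0:
        return 0.0
    return float(np.mean(coefficients == 0.0))

# Toy usage: fit an L1-regularized linear model on synthetic data where only
# two of ten features carry signal, then score the fitted model.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

model = Lasso(alpha=0.1).fit(X, y)
print(sparsity_interpretability_score(model.coef_))  # roughly 0.8 if 8 of 10 features are unused
```

Other proxies (number of rules in a rule list, tree depth, or human response time in a user study) could be substituted for the sparsity score with the same interface.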


