Why Business Leaders Need To Understand Their Algorithms
One of the biggest sources of anxiety about AI is not that it will turn against us, but that we simply cannot understand how it works. The solution to rogue systems that discriminate against women in credit applications, make racist recommendations in criminal sentencing, or reduce the number of Black patients identified as needing extra medical care might seem to be “explainable AI.” But sometimes, just as important as knowing “why” an algorithm made a decision is being able to ask “what” it was optimizing for in the first place. Machine-learning algorithms are often called a black box because they resemble a closed system that takes an input and produces an output, with no explanation of why. Knowing “why” matters in many industries: those with fiduciary obligations, like consumer finance; healthcare and education, where vulnerable lives are involved; and military or government applications, where decisions must be justifiable to the electorate. Read more at Harvard Business Review.