What makes this algorithm morally wrongful?

In 2020, Robert Julian-Borchak Williams became the first American known to be wrongfully arrested on the basis of a flawed match from a facial recognition algorithm. Law enforcement agencies have used facial recognition technology to identify suspects for nearly two decades.

The technology works well for white men but markedly worse for women and for people with darker skin. Williams is a Black man.

Read the newspaper article carefully. Your task (5-6 page paper, 12-point font, double-spaced):

- Descriptively diagnose. What has likely happened? Why is this software so bad at generating accurate matches for minorities? Where could the bias have crept into this facial recognition software, and where might it be perpetuated? Use the machine learning loop to do this analysis. Explain your answer to an audience that has no background in machine learning, data science, or computer science.
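To make the data-collection stage of the loop concrete, here is a toy sketch (not real facial recognition code; all names, numbers, and the nearest-template matcher are invented for illustration). It simulates two demographic groups: one enrolled with many photos per person and one with a single photo per person. The under-represented group ends up with noisier "templates," so the same matching algorithm makes more identification errors for that group, even though the algorithm itself treats everyone identically.

```python
import random

random.seed(0)
DIM, IDS, SIGMA = 5, 50, 0.2  # feature size, identities per group, image noise

def make_group(n_enroll_samples):
    """Create identities and enrolled templates for one demographic group.
    A template averages n_enroll_samples noisy observations of the true face,
    so a group with fewer samples gets noisier, less reliable templates."""
    true_faces = [[random.random() for _ in range(DIM)] for _ in range(IDS)]
    templates = []
    for face in true_faces:
        obs = [[x + random.gauss(0, SIGMA) for x in face]
               for _ in range(n_enroll_samples)]
        templates.append([sum(col) / n_enroll_samples for col in zip(*obs)])
    return true_faces, templates

def accuracy(true_faces, templates, probes_per_id=5):
    """Fraction of noisy probe images matched to the correct identity
    by a nearest-template search."""
    correct = total = 0
    for i, face in enumerate(true_faces):
        for _ in range(probes_per_id):
            probe = [x + random.gauss(0, SIGMA) for x in face]
            dists = [sum((p - t) ** 2 for p, t in zip(probe, tpl))
                     for tpl in templates]
            correct += dists.index(min(dists)) == i
            total += 1
    return correct / total

# Well-represented group: 20 enrollment photos per person.
faces_a, tpl_a = make_group(20)
# Under-represented group: one enrollment photo per person.
faces_b, tpl_b = make_group(1)

acc_a = accuracy(faces_a, tpl_a)
acc_b = accuracy(faces_b, tpl_b)
print(f"well-represented group accuracy:  {acc_a:.2f}")
print(f"under-represented group accuracy: {acc_b:.2f}")
```

The point of the sketch: the matching code contains no reference to race at all, yet a disparity in the data feeding the loop still produces a disparity in error rates, which arrests based on those errors would then perpetuate.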
