ExtremeTech Is Corrupt
#1
https://www.extremetech.com/computing/27...ition-tech
I called out how Joel committed a massive failure to do his research. Now my comment below has gone from already approved to "Hold on, this is waiting to be approved by ExtremeTech". Talk about corrupt censorship:
Quote: Joel failed to do his research. The ACLU's investigation was deeply flawed: https://itif.org/publications/2018/07/30...lic-safety
"The ACLU may be reluctant to release its code because it appears to have poorly implemented the software. The ACLU has admitted that it only used a confidence threshold of 80 percent. This threshold is suitable for some uses, such as tagging friends on social media, but Amazon recommends using a 99 percent threshold when higher levels of accuracy are necessary, such as in law enforcement. Indeed, Dr. Matt Wood, who oversees machine learning at Amazon Web Services, says that the ACLU’s reported error rate would drop from 5 percent to 0 percent at this higher confidence level.

Similarly, the ACLU may be reluctant to release its data because doing so could undermine its claims of bias. The ACLU says that Rekognition is biased because nearly 40 percent of the incorrect matches were people of color, while only 20 percent of members of Congress are people of color. Indeed, at first glance, this may seem like bias. But is that the right metric? What if the database of mugshots consisted of 60 percent people of color? Whether intentional or not, it is possible that the ACLU built its mugshot database to make it more likely for people of color to be wrongly matched if racial minorities were disproportionately represented in the dataset.

Certainly, it is possible that Rekognition performs better on lighter-skinned individuals compared to darker-skinned ones. Indeed, the National Institute of Standards and Technology (NIST) regularly tests many facial recognition algorithms and has documented how error rates vary by race and gender, and companies like Microsoft and IBM have announced initiatives to address these discrepancies. However, the ACLU was selective in claiming bias: It failed to publicize the fact that less than 4 percent of the false matches were female even though women hold 20 percent of the seats in Congress.

It is unclear what error rate and level of bias groups such as the ACLU are willing to accept. The standard should not be perfection, but rather better than the rates humans achieve today. And by that metric, facial recognition technology is clearly a positive step forward. Not only does it dramatically outperform humans in terms of matching speed (and cost), but it also beats humans on accuracy. And while different algorithms have variations in terms of accuracy depending on race and gender, these differences generally pale in comparison to well-documented human biases.

The ACLU’s most recent gimmick is unfortunately another setback for legitimate uses of facial recognition by law enforcement, which not only includes identifying suspects—as was the case in the Capital Gazette shooting—but also aiding victims of human trafficking and child exploitation.
Moreover, it is a distraction from legitimate efforts to improve facial recognition, as well as necessary conversations about how to address serious instances of abuse, misconduct, and bias in policing."
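To make the threshold point concrete: in Amazon's boto3 SDK, the confidence cutoff is just the SimilarityThreshold parameter on Rekognition's CompareFaces call, so rerunning the ACLU's test at 99 instead of 80 is a one-line change. A minimal sketch (the bucket and file names are hypothetical):

Code:
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def match_faces(source_key, target_key, threshold):
    """Compare two face images; only matches scoring at or above
    `threshold` are returned in FaceMatches."""
    response = rekognition.compare_faces(
        SourceImage={"S3Object": {"Bucket": "demo-bucket", "Name": source_key}},
        TargetImage={"S3Object": {"Bucket": "demo-bucket", "Name": target_key}},
        # The ACLU reportedly used 80; Amazon recommends 99 for law enforcement.
        SimilarityThreshold=threshold,
    )
    return response["FaceMatches"]

# At 80, borderline lookalikes are reported as matches; at 99 most of them drop out.
matches_80 = match_faces("congress_member.jpg", "mugshot.jpg", threshold=80)
matches_99 = match_faces("congress_member.jpg", "mugshot.jpg", threshold=99)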
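And on the base-rate question ("What if the database of mugshots consisted of 60 percent people of color?"), a quick simulation shows why the 40-percent figure proves nothing on its own: if an unbiased matcher produces false matches uniformly at random across the database, the demographic split of those false matches simply mirrors the split of the database. The 60 percent composition below is the article's hypothetical, not a measured number:

Code:
import random

random.seed(0)

# Hypothetical database composition: 60% of mugshots are people of color.
DB_SIZE = 25_000
db = ["poc"] * int(DB_SIZE * 0.60) + ["white"] * int(DB_SIZE * 0.40)

# Unbiased matcher: each false match lands on a uniformly random entry.
N_FALSE = 10_000
false_matches = [random.choice(db) for _ in range(N_FALSE)]

share_poc = false_matches.count("poc") / N_FALSE
print(f"Share of false matches that are people of color: {share_poc:.1%}")
# Prints roughly 60% -- i.e., the database's composition, with zero model bias.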