{"id":55,"date":"2020-08-13T10:44:07","date_gmt":"2020-08-13T14:44:07","guid":{"rendered":"https:\/\/www.macloo.com\/ai\/?p=55"},"modified":"2020-08-24T12:06:44","modified_gmt":"2020-08-24T16:06:44","slug":"racial-and-gender-bias-in-ai","status":"publish","type":"post","link":"https:\/\/www.macloo.com\/ai\/2020\/08\/13\/racial-and-gender-bias-in-ai\/","title":{"rendered":"Racial and gender bias in AI"},"content":{"rendered":"\n<p>Different AI systems do different things when they attempt to <strong>identify humans.<\/strong> Everyone has heard about face recognition (a k a <em>facial<\/em> recognition), which you might expect would return a name and other personal data about a person whose face is &#8220;seen&#8221; with a camera. <\/p>\n\n\n\n<p>No, not always.<\/p>\n\n\n\n<p>A system that analyzes human faces might simply try to return information about the person that you or I would tag in our minds when we see a stranger. The person&#8217;s gender, for example. That&#8217;s relatively easy to do most of the time for most humans \u2014 but it turns out to be tricky for machines. <\/p>\n\n\n\n<p>Machines often get it wrong when trying to identify the gender of a trans person. But machines <em>also<\/em> misidentify the gender of people of color. In particular, they have a big problem recognizing Black women as <em>women<\/em>.<\/p>\n\n\n\n<p>A short and good <a rel=\"noreferrer noopener\" href=\"https:\/\/time.com\/5520558\/artificial-intelligence-racial-gender-bias\/\" target=\"_blank\">article<\/a> about this ran in <em>Time<\/em> magazine in 2019, and the accompanying video is well worth watching. It shows various face recognition software systems at work.<\/p>\n\n\n\n<p>Another serious problem concerns differentiating among people of Asian descent. 
When apartment buildings and other housing developments have installed face recognition as a security system \u2014 to open for residents and stay locked for others \u2014 the Asian residents can find themselves <a rel=\"noreferrer noopener\" href=\"https:\/\/www.wired.com\/story\/cities-examine-proper-improper-facial-recognition\/\" target=\"_blank\">locked out of their own homes<\/a>. The doors can also <em>open<\/em> for Asian people who <em>don&#8217;t<\/em> live there.<\/p>\n\n\n\n<p>You can find a lot of articles about this widespread and very serious problem with AI technology, including the deservedly famous <a rel=\"noreferrer noopener\" href=\"https:\/\/www.aclu.org\/blog\/privacy-technology\/surveillance-technologies\/amazons-face-recognition-falsely-matched-28\" target=\"_blank\">mug shots test<\/a> by the American Civil Liberties Union.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>\u201cWhile it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied.\u201d<\/p><cite>\u2014Patrick Grother, NIST computer scientist<\/cite><\/blockquote>\n\n\n\n<p>So how does this happen? How do companies with almost infinite resources deploy products that are so seriously \u2014 and even dangerously \u2014 flawed?<\/p>\n\n\n\n<p><a href=\"https:\/\/www.macloo.com\/ai\/2020\/08\/12\/ask-a-computer-to-draw-what-it-sees\/\">Yesterday I wrote<\/a> a little about training data for object-detection AI. To identify any image, or any <em>part<\/em> of an image, an AI system is usually <em>trained<\/em> on an immense set of images. If you want to identify human faces, you feed the system hundreds of thousands, or even millions, of pictures of human faces. If you&#8217;re using <strong>supervised learning<\/strong> to train the system, the images are labeled: Man, woman. Black, white. 
Old, young. Convicted criminal. Sex offender. Psychopath.<\/p>\n\n\n\n<p><em>Who<\/em> is in the images? <em>How<\/em> are those images labeled?<\/p>\n\n\n\n<p>This is <em>part<\/em> of how the whole thing goes sideways. There&#8217;s more to it, though. Before a system is marketed, or released to the public, its developers are going to <em>test it<\/em>. They&#8217;re going to test the <em>hell<\/em> out of it. Compare this with an AI developed to play a particular game, like Go or chess: after the system has been trained, you&#8217;re going to <a href=\"https:\/\/www.macloo.com\/ai\/2020\/08\/10\/ai-programs-that-play-games\/\">have it play, and see whether it can win<\/a> \u2014 consistently. So when developers create a face recognition system, and they&#8217;ve tested it extensively, and they say, great, now it&#8217;s ready for the public, it&#8217;s ready for commercial use \u2014 ask yourself how they missed these glaring flaws.<\/p>\n\n\n\n<p>Ask yourself how they <em>missed the fact<\/em> that the system can&#8217;t differentiate among various Asian faces.<\/p>\n\n\n\n<p>Ask yourself how they <em>missed the fact<\/em> that the system identifies Black women as men.<\/p>\n\n\n\n<p>Fortunately, in just the past year these flaws have received so much attention that a number of large firms (Amazon, IBM, Microsoft) have pulled back on commercial deployments of face recognition technologies. Whether they will be able to build more trustworthy systems remains to be seen. 
<\/p>\n\n\n\n<p>More about bias in face recognition systems:<\/p>\n\n\n\n<ul class=\"wp-block-list\"><li><a rel=\"noreferrer noopener\" href=\"https:\/\/www.fastcompany.com\/90525023\/most-creative-people-2020-joy-buolamwini\" target=\"_blank\">Meet the computer scientist and activist who got Big Tech to stand down<\/a> (<em>Fast Company, <\/em>August 2020) \u2014 about AI researcher Joy Buolamwini <\/li><li><a rel=\"noreferrer noopener\" href=\"https:\/\/www.nist.gov\/news-events\/news\/2019\/12\/nist-study-evaluates-effects-race-age-sex-face-recognition-software\" target=\"_blank\">NIST Study Evaluates Effects of Race, Age, Sex on Face Recognition Software<\/a> (December 2019) \u2014 this article is the source for the pullout quote above <\/li><li><a rel=\"noreferrer noopener\" href=\"https:\/\/www.designbetter.co\/conversations\/timnit-gebru\" target=\"_blank\">Timnit Gebru: Machine learning, bias, and product design<\/a> (March 2018) \u2014 although this interview is older, I like how Gebru speaks about design: &#8220;We need more people who think about design <em>working in AI, <\/em>because oftentimes what&#8217;s happening is <em>little things<\/em>. &#8230; think about Siri or Alexa or these personal assistants who are women \u2014 what does that do to society, just portraying that stereotype?&#8221;<\/li><li><a rel=\"noreferrer noopener\" href=\"https:\/\/www.blog.google\/technology\/ai\/googlers-leading-machine-learning-fairness\/\" target=\"_blank\">Meet the Googlers working to ensure tech is for everyone<\/a> (May 2020) \u2014 Gebru is now a research scientist at Google. 
Along with colleagues Tiffany Deng and Tulsee Doshi, she works on ensuring fairness in AI.<\/li><\/ul>\n\n\n\n<p><a rel=\"license\" href=\"http:\/\/creativecommons.org\/licenses\/by-nc-nd\/4.0\/\"><img decoding=\"async\" alt=\"Creative Commons License\" style=\"border-width:0\" src=\"https:\/\/i.creativecommons.org\/l\/by-nc-nd\/4.0\/88x31.png\"><\/a><br>\n<small><span xmlns:dct=\"http:\/\/purl.org\/dc\/terms\/\" property=\"dct:title\"><strong>AI in Media and Society<\/strong><\/span> by <span xmlns:cc=\"http:\/\/creativecommons.org\/ns#\" property=\"cc:attributionName\">Mindy McAdams<\/span> is licensed under a <a rel=\"license\" href=\"http:\/\/creativecommons.org\/licenses\/by-nc-nd\/4.0\/\">Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License<\/a>.<br>\nInclude the author&#8217;s name (Mindy McAdams) and a link to the original post in any reuse of this content.<\/small><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Different AI systems do different things when they attempt to identify humans. Everyone has heard about face recognition (a k a facial recognition), which you might expect would return a name and other personal data about a person whose face is &#8220;seen&#8221; with a camera. No, not always. 
A system that analyzes human faces might&hellip; <a class=\"more-link\" href=\"https:\/\/www.macloo.com\/ai\/2020\/08\/13\/racial-and-gender-bias-in-ai\/\">Continue reading <span class=\"screen-reader-text\">Racial and gender bias in AI<\/span> <span class=\"meta-nav\" aria-hidden=\"true\">&rarr;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[6,3],"tags":[21,23,22],"class_list":["post-55","post","type-post","status-publish","format-standard","hentry","category-ethics-and-bias","category-image-recognition","tag-face_recognition","tag-gender","tag-race"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/posts\/55","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/comments?post=55"}],"version-history":[{"count":10,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/posts\/55\/revisions"}],"predecessor-version":[{"id":193,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/posts\/55\/revisions\/193"}],"wp:attachment":[{"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/media?parent=55"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/wp\/v2\/categories?post=55"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.macloo.com\/ai\/wp-json\/w
p\/v2\/tags?post=55"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}