Is Google's search algorithm guilty of racism? A study by a Harvard researcher found that it could be.

Professor Latanya Sweeney says she found "statistically significant discrimination" when comparing the ads served with results from online searches made using names associated with blacks and those associated with whites. Sweeney, who is African-American, began her study after learning that a Google search for her own name produced an ad for a background-check service hinting that she'd been arrested.

Her hypothesis: Names associated with African-Americans, such as Latanya, are more likely to trigger negative ad associations than names such as Jill, which aren't.

I haven't spoken to Sweeney, but I don't believe she is accusing Google of deliberate racism, and neither am I. It would be easy for some to dismiss her work as tortured political correctness, but that's wrong too. What is important about her work, I believe, is that it gives some insight into the experience that African-Americans may have on the Internet.

After all, doing a search on your own name (I bet you've done that) and being served an ad for a service called Instant Checkmate that implies you are a criminal has got to feel demeaning. When Sweeney searched on her name, Instant Checkmate ads saying "Latanya Sweeney, Arrested?" and "Check Latanya Sweeney's Arrests" appeared in the paid-results part of the search page.

Suppose her daughter had seen that?

When Sweeney then clicked on those ads and paid the fee, it turned out, not surprisingly, that there was no record of her being arrested. By way of contrast, she searched for the names "Kristen Haring," "Kristen Sparrow," and "Kristen Lindquist." Ads came up, but not from Instant Checkmate or other similar services.
But when she searched for those names on Instant Checkmate itself, the company's database showed arrest records for two of those women.

On Reuters.com, which uses Google AdSense to serve ads, a "black-identifying name was 25 percent more likely to get an ad suggestive of an arrest record," Sweeney found. On Google, 92 percent of the ads appearing next to black-identifying names suggested a criminal record, compared with 80 percent of those next to white-identifying names, she wrote.

It's not clear what's behind this. Google, of course, denies that it engages in what you'd have to call racial profiling. It may be that Instant Checkmate, which had the most online ads of any company tracked in the study, chose to link black-identifying names with ad templates suggesting a criminal record, though the company told Sweeney that it doesn't do that.

"There is discrimination in delivery of these ads," Sweeney writes in her report. "Notice that racism can result, even if not intentional, and that online activity may be so ubiquitous and intimately entwined with technology design that technologists may now have to think about societal consequences like structural racism in the technology they design."

I suspect that what may be going on here has to do with the types of searches millions of people make every day. Google's algorithm does track searches and uses that information to make search more relevant. If enough people are searching on black-sounding names together with terms like "crime" or "arrest," that might explain this.

Ultimately, I bet we'll never find out, but it's worth thinking about the ways the Web affects all of us in some very unexpected ways.
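To make the feedback-loop idea concrete, here is a toy sketch of how a "neutral" ad system could end up learning a biased name-to-template pairing. This is purely an illustrative assumption, not a description of Google's or Instant Checkmate's actual systems, which are not public: the chooser below simply serves whichever ad template has the best observed click-through rate for a name, and the names, templates, and click probabilities are all hypothetical.

```python
import random

# Hypothetical sketch: a click-through-rate (CTR) driven ad chooser.
# No one codes race into it; it only counts clicks and impressions.
class AdChooser:
    """Serves the template with the best observed CTR for a name,
    exploring a random template 10% of the time."""

    def __init__(self, templates, epsilon=0.1):
        self.templates = templates
        self.epsilon = epsilon
        self.stats = {}  # (name, template) -> [clicks, impressions]

    def _ctr(self, name, template):
        clicks, views = self.stats.get((name, template), (0, 1))
        return clicks / views

    def choose(self, name):
        if random.random() < self.epsilon:            # explore
            return random.choice(self.templates)
        return max(self.templates,                    # exploit best CTR
                   key=lambda t: self._ctr(name, t))

    def record(self, name, template, clicked):
        clicks, views = self.stats.setdefault((name, template), [0, 0])
        self.stats[(name, template)] = [clicks + int(clicked), views + 1]

# Simulation under an assumed (hypothetical) behavior pattern: users
# shown ads next to this name click the "arrest" template twice as often.
random.seed(0)
chooser = AdChooser(["background check", "arrest record"])
for _ in range(5000):
    t = chooser.choose("latanya")
    p_click = 0.10 if t == "arrest record" else 0.05
    chooser.record("latanya", t, random.random() < p_click)

for t in chooser.templates:
    clicks, views = chooser.stats[("latanya", t)]
    print(f"{t}: served {views} times, CTR {clicks / views:.3f}")
```

Under these assumed click rates, the chooser drifts toward serving the "arrest record" template for that name far more often than the neutral one, which is the shape of the outcome Sweeney describes: discriminatory delivery without discriminatory intent anywhere in the code.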