Are Search Engines Racist? Surprising Google Images Results

There has been some interesting research recently about whether the name at the top of a résumé affects whether a person will get an interview. Sadly, one study found that résumés with “white-sounding” names are downloaded 17% more often than those with “black-sounding” names, and that candidates with black-sounding names are less likely to get callbacks. Could it be that negative perceptions are attached to black-sounding names based on past experiences?

In another somewhat mysterious study, researcher Latanya Sweeney found that on a website that checks public records, the advertisements served by Google changed significantly depending on the name she typed in. Sweeney discovered that when she typed in a black-sounding name, the ads were 25% more likely to suggest that a person with that name had an arrest record, regardless of whether the person had ever been arrested. Could such search results and ad placements, in time, affect our perception of people and the names they are attached to?

I decided to take these research studies and do a quick (and admittedly unscientific) analysis of my own. I wondered what a Google Image search would bring up if I typed in “black” names and “white” names. My results seem just about as mysterious as those from Sweeney’s study. In their book Freakonomics, authors Steven D. Levitt and Stephen J. Dubner compiled a list of the “whitest” and “blackest” names in America. I took the top 10 from each list of boy names and typed them into Google Images to see what I would find. On my computer screen, I looked at all the results that fit on one screen at a time, without scrolling (my screen is a 26″ monitor). Typically, I would be able to view anywhere between 45 and 60 images for each search.

The “black” names I typed in, in 10 separate searches, included the following: Deshawn, DeAndre, Marquis, Darnell, Terrell, Malik, Trevon, Tyrone, Willie, and Dominique. The “white” names I typed in were Jake, Connor, Tanner, Wyatt, Cody, Dustin, Luke, Jack, Scott, and Logan.  Here are my image results for two sample searches, “Scott” and “Deshawn”:

Google Image Search Result for “Scott,” one of 10 “white” name searches.
Google Image search for “Deshawn,” one of 10 “black” name searches.

After 10 searches of black names, I viewed 502 total images. What I found was surprising (and discouraging). Forty-two of those images (about 8% or roughly 1 in every 12) were either mugshots or police criminal photos posted in newspapers or criminal registries.

Of the 10 black names that I typed in, though, 5 were heavily dominated by celebrity images. For example, when I typed in “Terrell,” nearly 100% of the photos were either of Terrell Owens, the NFL football star, or supermodel Mercedes Terrell. When I typed in “Willie,” nearly all of the photos were of singer Willie Nelson or Groundskeeper Willie from The Simpsons. And the search for “DeAndre” pulled up almost entirely photos of American Idol phenom DeAndre Brackensick. Because these celebrity-dominated searches might be considered exceptional, I wondered what the percentage would be if I removed those five names and looked only at the remaining, more randomized images. What I found was that 41 out of the 263 remaining images were mugshots or crime-suspect photos. That is over 15%, or nearly 1 in every 6 images!

What is most interesting is the comparison to the white-name searches. Of the 506 total images, only 2 were mugshots or crime-suspect photos. That is less than 0.4%, or 1 in every 250. Quite a significant difference! Certainly there are people named Connor or Jake or Wyatt or Luke with criminal records, but when I searched each of those names, none of the images were of mugshots or crime suspects.
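For what it’s worth, the counts reported above can be sanity-checked with a few lines of arithmetic. The sketch below (my own addition, not part of the original write-up) recomputes the percentages and applies a standard two-proportion z-test to the 42-of-502 versus 2-of-506 comparison, under the usual textbook assumption that each image is an independent draw:

```python
import math

def pct(hits, total):
    """Share of images that were mugshots/crime-suspect photos, as a percentage."""
    return 100 * hits / total

# Counts reported in the article.
print(f"black-name searches:         {pct(42, 502):.1f}%")  # ~8.4%, roughly 1 in 12
print(f"minus celebrity-heavy names: {pct(41, 263):.1f}%")  # ~15.6%, nearly 1 in 6
print(f"white-name searches:         {pct(2, 506):.1f}%")   # ~0.4%, about 1 in 250

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test: how many standard errors apart are the two rates?"""
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (x1 / n1 - x2 / n2) / se

z = two_proportion_z(42, 502, 2, 506)
print(f"z = {z:.1f}")  # about 6.2 -- far beyond the ~1.96 cutoff for p < 0.05
```

Taken at face value, a gap that large is far too big to be sampling noise. Of course, the test assumes the roughly 500 images per group are independent samples, which a screenful of search results is not, so treat this as a rough check rather than rigorous statistics.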

Does this mean that Google is being racist? Not exactly. To be fair to Google (or any other search engine company), image results are based on a number of complex factors, and they change every day. The images shown are ranked by algorithms that weigh site content, traffic, links to the site, and how well pages are optimized for search engines. Google controls the algorithm that produces the results, but it has little or no control over the people who run the websites and tweak them to climb the rankings. Google may not be racist, but we might wonder why the search results clearly appear to be. What is happening within our culture and on the internet that causes such a discrepancy in the way images displayed in random searches relate to names and crimes?

Regardless of how the names get to the final results page, I can’t help but wonder about the perceptions we garner daily from seemingly innocuous searches. We might do well to pay closer attention to the advertisements and images that show up when we read material online. Even the advertisements on this webpage were selected by Google according to the content of the article and your own search habits and location. Did you notice what Google thought you wanted to know about?

If you were to type a random man’s name into a Google Images search, you would probably expect to see a wide array of images of people with that name. You might see anything from a family photo to a circus performer to a high school party scene. But would the collection of images as a whole affect the way you perceive that name (and, hence, people who claim that name as their own)? Research has repeatedly shown that our memories are inaccurate, built on collections of images and experiences that we are exposed to throughout our lives. When we are exposed to images of people doing certain things or associated with certain topics, it is very possible that we unknowingly skew our perception of them. During the Hurricane Katrina disaster, for example, images were repeatedly shown in the news of black men “looting.” The unintended consequence is that people have a stored memory of black men looting and little, if any, recollection of white men (or women) looting. Perception has changed.

What we type in and search for on the internet, similarly, may have more lasting effects on what we remember than we realize. When we are exposed to images in search engine results over time, it is very possible that we make subconscious links to the search engine terms and the images that are provided to us. It’s an unfortunate, scary, but quite realistic thought. It is especially disconcerting when race is involved and image searches provide disturbing results.

4 thoughts on “Are Search Engines Racist? Surprising Google Images Results”

  • August 13, 2013 at 12:21 pm

    This is interesting, but I’d say the logic is flawed. Computers are literally color blind. They cannot be racist unless they are specifically programmed to be so.

    Google has fairly strong, fairly accurate algorithms for sifting through the massive amounts of information on the internet and coming up with the most pertinent results for any given search. While I am sure flaws exist, I’d imagine the search results that were found are in general a reasonable representation of what’s out there, and if so, I don’t think a sampling of available images, presented with no racial bias whatsoever (not humanly possible, of course, but available through computers) can really be labeled racist.

    I think we’ve reached a point as a society of crying “racism” every time any differences between races are found. Though some may be caused or manifest through racism, I think we would serve society and members of all races better by acknowledging differences and learning from them, and then we might have wisdom and understanding to pursue effective solutions to whatever problems and inequities might exist.

    There could be any number of explanations for the results found (not suggesting any of these are the actual causes, but illustrating the variety of possibilities). Maybe the group with the “most black” names lack the resources to have their own webpages or be featured on others’. Maybe they’ve been systematically oppressed by “white” society and excluded from online mention. Maybe only Geotechnic Karfoople professors have their biographies posted online, and they’re all named Wyatt and Connor. Maybe Malik and Tyrone are more athletically inclined or have an affinity to nature and spend more time outdoors instead of in front of computers, posting selfies. Maybe a secret society of hackers is prowling the internet and removing positive pictures of people named Deshawn. And maybe, just maybe, a lot more people named Darnell are out there getting their mug shot taken than people named Luke. Maybe the “black” names that were chosen for review are simply examples of unique names, and bearers of unusual names are more likely to perform as positive or negative outliers (hence multiple celebrities as well as multiple crime suspects) and trend away from typical or average outcomes.

    In the end it’s always important to remember there is a difference between the online conglomeration of information and physical reality, but online data can characterize and reflect the “real world” to varying degrees. If racial disparities come up when considering the massive quantities of data used and filtered by Google searches (assuming that this name-search result actually does display racial disparities rather than characterizing a smaller group with a particular naming culture), perhaps we can consider the possibility that a disparity might actually exist, with whatever causes and significance might appear upon further investigation, rather than labeling the situation “racist” and attributing it purely to some type of unfair bias and bracing ourselves against being influenced by it.

    The human mind is designed to make generalizations. If we see many black criminal suspects and few of other races, it’s going to be difficult or impossible for just about anyone not to be influenced by that.

    If that’s due to unfair reporting (if a higher proportion of images of black criminals than white criminals are shared with the public), or unfair prosecution (if more black criminals than white criminals are prosecuted), that would be good reason to cry “racism,” but these image search results don’t support that so far. There’s also not enough evidence to say conclusively that a higher proportion of blacks (or rather, members of the naming culture considered) end up with their mug shots or other criminal photos taken, but it does seem to be the simplest and most likely explanation of the situation (applying Occam’s razor).

    I think it’s definitely worth investigating to see if the representations are fair and accurate, and if they’re not we should try to remedy the situation. But unless we have some reason to believe that they’re not, it doesn’t seem very productive to try to avoid being influenced or biased by reality (or the best and most available representation of it that we have), because that’s a losing battle. The human mind will synthesize and generalize continuously as long as there is input. If we want to protect it from inaccuracy, we should strive for accurate input, instead of coaching ourselves and others to reject some input out of hand simply because it may lead to conclusions that aren’t politically correct.

  • April 10, 2014 at 10:37 pm

    I do notice the wording attached to photographs in news articles. For example, MSN will use a negative-sounding caption when referencing a person of color more often than with a European (American). When showing a picture of the Pennsylvania teen stabber, he is referenced as ‘not a weirdo’. When referencing two separate African American sports figures, one has a slowest home run record, and the other is kicked off a team. Yet another references a man of color as ‘Kaepernick’ investigated in suspicious incident. These references can be blatant or very subtle and I believe it is intentional. The same with Yahoo and Bing. I also see MANY more negative comments about our current president than any other in my lifetime (I am 63).

  • October 21, 2014 at 9:05 pm

    Try this little experiment: Go to Google and choose “Images”. Then type in “Jew” and see the results. Count the total number of images given to you on the first try (it’s a very finite number) and then count the number of images you would classify as racist. Calculate the percentage of racist images.

    Then type in “Muslim” and do the same count. You will find that the Jewish search brings up a predominantly racist result and the Muslim one does not. That is Google’s anti-Semitic leaning.

  • October 23, 2014 at 11:18 pm

    I was testing this idea out too, but adding a little personal search: “curl hairstyle”. Most if not all of the images were of American/white/Caucasian female celebrities. Try “straight hairstyle” and you get similar results. You won’t get Black or Asian results until you specifically type it in as a keyword…. interesting and unfortunate

Comments are closed.