
Removing gender-based stereotypes from algorithms

Most people don’t think of algorithms as having biases and stereotypes, but James Zou, in his Sept. 26, 2016 essay for The Conversation (h/t phys.org Sept. 26, 2016 news item), says otherwise (Note: Links have been removed),

Machine learning is ubiquitous in our daily lives. Every time we talk to our smartphones, search for images or ask for restaurant recommendations, we are interacting with machine learning algorithms. They take as input large amounts of raw data, like the entire text of an encyclopedia, or the entire archives of a newspaper, and analyze the information to extract patterns that might not be visible to human analysts. But when these large data sets include social bias, the machines learn that too.

A machine learning algorithm is like a newborn baby that has been given millions of books to read without being taught the alphabet or knowing any words or grammar. The power of this type of information processing is impressive, but there is a problem. When it takes in the text data, a computer observes relationships between words based on various factors, including how often they are used together.
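The “how often they are used together” signal Zou describes is co-occurrence counting. Here’s a toy sketch of that first step in Python (real systems ingest billions of words and then learn dense vectors, the word embeddings discussed below, from statistics like these):

```python
# Toy co-occurrence counter: the raw signal word embeddings are built on.
# Real systems use billions of words and much larger context windows.
from collections import Counter

text = "the king spoke to the queen and the queen smiled".split()
window = 2  # how many following words count as "nearby"

pairs = Counter()
for i, word in enumerate(text):
    for neighbour in text[i + 1 : i + 1 + window]:
        pairs[(word, neighbour)] += 1

print(pairs.most_common(3))
```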

We can test how well the word relationships are identified by using analogy puzzles. Suppose I ask the system to complete the analogy “He is to King as She is to X.” If the system comes back with “Queen,” then we would say it is successful, because it returns the same answer a human would.

Our research group trained the system on Google News articles, and then asked it to complete a different analogy: “Man is to Computer Programmer as Woman is to X.” The answer came back: “Homemaker.”
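In embedding terms, that analogy test is vector arithmetic: take the vector for “computer programmer”, subtract “man”, add “woman”, and look for the nearest word. Here’s a minimal sketch using the gensim library; the vector file name and the exact token “computer_programmer” are placeholders, not the researchers’ actual setup:

```python
# Hedged sketch of the analogy test; "news-vectors.bin" is a placeholder
# for a pretrained word2vec file such as the Google News vectors.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format("news-vectors.bin", binary=True)

# "Man is to Computer Programmer as Woman is to X" as vector arithmetic:
# programmer - man + woman, then report the nearest remaining word.
result = vectors.most_similar(positive=["woman", "computer_programmer"],
                              negative=["man"], topn=1)
print(result)  # a biased embedding can return "homemaker"
```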

Zou explains how a machine (algorithm) learns and then notes this,

Not only can the algorithm reflect society’s biases – demonstrating how much those biases are contained in the input data – but the system can potentially amplify gender stereotypes. Suppose I search for “computer programmer” and the search program uses a gender-biased database that associates that term more closely with a man than a woman.

The search results could come back flawed by the bias. Because “John” as a male name is more closely related to “computer programmer” than the female name “Mary” in the biased data set, the search program could evaluate John’s website as more relevant to the search than Mary’s – even if the two websites are identical except for the names and gender pronouns.

It’s true that the biased data set could actually reflect factual reality – perhaps there are more “Johns” who are programmers than there are “Marys” – and the algorithms simply capture these biases. This does not absolve the responsibility of machine learning in combating potentially harmful stereotypes. The biased results would not just repeat but could even boost the statistical bias that most programmers are male, by moving the few female programmers lower in the search results. It’s useful and important to have an alternative that’s not biased.
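To see how that reordering could happen, here’s a toy illustration in Python (not the actual search program; the three-dimensional vectors are invented to mirror the bias described in the quote):

```python
# Toy ranking demo with invented vectors: "john" sits closer to the
# "computer programmer" query than "mary" does, as in the biased data set.
import numpy as np

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

query = np.array([0.9, 0.1, 0.0])  # "computer programmer"
john  = np.array([0.8, 0.2, 0.1])
mary  = np.array([0.3, 0.2, 0.9])

pages = {"John's page": john, "Mary's page": mary}
for name, vec in sorted(pages.items(),
                        key=lambda kv: cosine(query, kv[1]),
                        reverse=True):
    print(name, round(cosine(query, vec), 3))
# John's otherwise-identical page ranks first, reinforcing the bias.
```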

According to Zou, there is a way to remove these stereotypes,

Our debiasing system uses real people to identify examples of the types of connections that are appropriate (brother/sister, king/queen) and those that should be removed. Then, using these human-generated distinctions, we quantified the degree to which gender was a factor in those word choices – as opposed to, say, family relationships or words relating to royalty.

Next we told our machine-learning algorithm to remove the gender factor from the connections in the embedding. This removes the biased stereotypes without reducing the overall usefulness of the embedding.

When that is done, we found that the machine learning algorithm no longer exhibits blatant gender stereotypes. We are investigating applying related ideas to remove other types of biases in the embedding, such as racial or cultural stereotypes.
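The core of that debiasing step is a projection: find a “gender direction” in the embedding (for example, the difference between the vectors for “he” and “she”) and subtract each word’s component along it. The published method does more (it derives the gender subspace from many word pairs and equalizes pairs like brother/sister), but a minimal sketch of the projection idea looks like this:

```python
# Minimal sketch of the neutralizing step only; the full method also
# equalizes legitimate pairs (brother/sister, king/queen).
import numpy as np

def debias(word_vec, he_vec, she_vec):
    """Remove the component of word_vec along the he-she axis."""
    gender_dir = he_vec - she_vec
    gender_dir = gender_dir / np.linalg.norm(gender_dir)
    # Subtract word_vec's projection onto the gender direction,
    # leaving the gender-neutral remainder.
    return word_vec - (word_vec @ gender_dir) * gender_dir

# Toy 3-d vectors, purely illustrative:
he = np.array([1.0, 0.0, 0.0])
she = np.array([0.0, 1.0, 0.0])
programmer = np.array([0.7, 0.2, 0.6])
print(debias(programmer, he, she))  # -> [0.45, 0.45, 0.6], he/she symmetric
```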

If you have time, I encourage you to read the essay in its entirety and this June 14, 2016 posting about research into algorithms and how they make decisions for you about credit, medical diagnoses, job opportunities and more.

There’s also an Oct. 24, 2016 article by Michael Light on Salon.com on the topic (Note: Links have been removed),

In a recent book that was longlisted for the National Book Award, Cathy O’Neil, a data scientist, blogger and former hedge-fund quant, details a number of flawed algorithms to which we have given incredible power — she calls them “Weapons of Math Destruction.” We have entrusted these WMDs to make important, potentially life-altering decisions, yet in many cases, they embed human race and class biases; in other cases, they don’t function at all.
Among other examples, O’Neil examines a “value-added” model New York City used to decide which teachers to fire, even though, she writes, the algorithm was useless, functioning essentially as a random number generator, arbitrarily ending careers. She looks at models put to use by judges to assign recidivism scores to inmates that ended up having a racist inclination. And she looks at how algorithms are contributing to American partisanship, allowing political operatives to target voters with information that plays to their existing biases and fears.

I recommend reading Light’s article in its entirety.

Women in nanoscience and other sciences too

Last week, three women were honoured for their work in nanoscience with L’Oréal Singapore For Women in Science Fellowships (from the news item on Nanowerk),

In its second year, the Fellowships are organised with the support of the Singapore National Commission for UNESCO and in partnership with the Agency for Science, Technology and Research (A*STAR). The Fellowships aim to recognise the significant contribution of talented women to scientific progress, encourage young women to pursue science as a career and promote their effective participation in the scientific development of Singapore.

The three outstanding women were awarded fellowships worth S$20,000 to support them in their doctorate or post-doctorate research. This year’s National Fellows are:

– Dr. Low Hong Yee, 2010 L’Oréal Singapore For Women in Science National Fellow and Senior Scientist at A*STAR’s Institute of Materials Research and Engineering. Her work in nanoimprint technology, an emerging technique in nanotechnology, focuses on eco solutions and brings to reality the ability to mimic and apply on synthetic surfaces the structure found in naturally occurring exteriors or skin such as the iridescent colours of a butterfly’s wings or the water-proofing of lotus leaves. This new development offers an eco-friendly, non-chemical method to improve the properties and functionalities of common plastic film.

– Dr. Madhavi Srinivasan, 2010 L’Oréal Singapore For Women in Science National Fellow and Assistant Professor at the Nanyang Technological University. Dr Srinivasan seeks to harness the power of nanoscale materials for the answer to the future of energy storage. Such technologies are vital for the future of a clean energy landscape. Its applications include powering electric vehicles, thus reducing overall CO2 emission, and reducing global warming or enhancing renewable energy sources (solar/wind), thus reducing pollution and tapping on alternative energy supplies.

– Dr. Yang Huiying, 2010 L’Oréal Singapore For Women in Science National Fellow and Assistant Professor at Singapore University of Technology and Design. Dr Yang’s fascination with the beauty of the nano-world prompted her research into the fabrication of metal oxide nanostructures, investigation of their optical properties, and the development of nanophotonics devices. These light emitting devices will potentially be an answer to the need for energy-saving and lower cost display screens, LED bulbs, TV and DVD players etc.

This announcement reminded me of a question I occasionally ask myself: why aren’t there more women mentioned prominently in nanotechnology/nanoscience narratives? There are a few (the ones I’ve heard of are from the US: Christine Peterson/Foresight Institute; Mildred Dresselhaus, advisor to former US Pres. Bill Clinton; Kristen Kulinowski/Rice University and the Good Nano Guide; please let me know of any others that should be added to this list), just not as many as I would have expected.

On a somewhat related note, there was this blog post by one of the co-authors of the article, The Internet as a resource and support network for diverse geoscientists, which focused largely on women,

In the September issue of GSA Today, you can find our article on The Internet as a resource and support network for diverse geoscientists. We wrote the article with the idea of reaching beyond the audience that already reads blogs (or attends education/diversity sessions at GSA), with the view that we might be able to open some eyes as to why time spent on-line reading and writing blogs and participating in Twitter might be a valuable thing for geoscientists to be doing. And, of course, we had some data to support our assertions.

As a white woman geoscientist in academia, I have definitely personally and professionally benefited from my blog reading and writing time. (I even have a publication to show for it!) But I would love to hear more from minority and outside-of-academia geoscientists about what blogs, Twitter, and other internet-based forms of support could be doing to better support you. As you can see from the paragraph above, what we ended up advocating was that institutional support for blogging and blog-reading would help increase participation. We thought that, with increased participation, more minority and outside-of-academia geoscience voices would emerge, helping others find support, community, role models, and mentoring in voices similar to their own. Meanwhile those of us closer to the white/academic end of the spectrum could learn from all that a diverse geoscientist community has to offer.

The 2-page article is open access and can be found here.

Meanwhile, women in technology should be taking this tack according to an article by Allyson Kapin on the Fast Company website,

We have a rampant problem in the tech world. It’s called the blame game. Here’s how it works. You ask the question, “Why aren’t there enough women in tech or launching startups?” From some you get answers like, “Because it’s an exclusive white boys club.” But others say, “Not true! It’s because women don’t promote their expertise enough and they are more risk averse.” How can we truly address the lack of women in tech and startups and develop realistic solutions if we continue to play this silly blame game?

Yesterday, Michael Arrington of TechCrunch wrote a blog post saying, “It doesn’t matter how old you are, what sex you are, what politics you support or what color you are. If your idea rocks and you can execute, you can change the world and/or get really, stinking rich.”

That’s a nice idea and if it were true then the amount of wealthy entrepreneurs would better match our population’s racial and gender demographics. The fact remains that in 2009 angel investors dished out $17.6 billion to fund startups. Wonder how many funded startups were women-run? 9.4%, according to the 2009 angel investor report from Center for Venture Research at University of New Hampshire. And only 6% of investor money funded startups run by people of color.

Yet Arrington says it’s because women just don’t want it enough and that he is sick and tired of being blamed for it. He also says TechCrunch has “beg[ged] women to come and speak” and participate in their events and reached out to communities but many women still decline.

Unfortunately, the article presents two different ideas (thank you, Allyson Kapin, for refuting Arrington’s thesis) without relating them to each other. First, there is a ‘blame game’ which isn’t getting anyone anywhere; second, there are issues with getting women to speak on technology panels. The article offers some good suggestions for dealing with the second problem, while the first is left to rest.

Kapin is right: the blame game doesn’t work in anyone’s favour, but then we have to develop some alternatives. I have something here from Science Cheerleader which offers a stereotype-breaking approach to dealing with some of the issues that women in science confront. Meet Christine,

Meet Christine (image found on sciencecheerleader.com)

Meet Erica,

Meet Erica (image found on sciencecheerleader.com)

One of these women is a software engineer and the other is a biomedical engineer. Do visit Science Cheerleader to figure out which woman does what.

Changing the way women are perceived is a slow and arduous process; it requires a great number of strategies, along with the recognition that those strategies have to be adjusted as the nature of the prejudice/discrimination changes in response to the strategies designed to counter it in the first place. For example, efforts like the L’Oréal fellowships for women have been described as reverse discrimination, since men don’t have access to the awards by reason of their gender while standard fellowship programmes are open to all. It’s true the programmes are open to all, but we need to use a variety of means (finding speakers for panels, special financial awards programmes, stereotype-breaking articles, refuting an uninformed statement, etc.) to encourage greater participation by women and members of other groups that have traditionally not been included. After all, there’s a reason why most of the prominent Nobel science prize winners are white males, and it’s not because they are naturally better at science.