Su Ning Goh

Technology and Representation

Photo by Alex Knight on Unsplash

The Digital Revolution made technology ubiquitous, a force that has become equally revered and feared. It is indisputable that technology has transformed, and will continue to transform, our lives. In this light, we need to consider the implications of the enormous role technology plays. From the psychological effects of technology addiction to the moral consequences of artificial intelligence, the issues surrounding technology span many facets. In my view, inclusivity and representation is one of the questions that most needs to be critically addressed in any discussion of technology.

It is a common fallacy that computers are objective and therefore better than humans at making decisions (1). Humans are fallible: emotions and mental heuristics can cloud our judgement. In contrast, artificial intelligence follows a set of clear-cut rules defined by its programmers, so there should, in theory, be no room for bias.

In reality, bias enters at the inception of the program. It is naive to think of these tools as “neutral” (2). The impact of these algorithms is unavoidable: they are applied in advertising, recruiting, pricing strategies and more, across all sorts of industries. Consider the accidental racism of Google’s image recognition software in 2015, which labelled a photo of two African Americans as “gorillas” (3). Or, for a mistake with further-reaching consequences, consider COMPAS, a popular risk assessment tool used by judges to predict whether an offender will commit a crime again: it was found to label black defendants as higher risk of recidivism than transpired in reality, while underestimating the risk of white defendants (4).
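The disparity ProPublica described can be made concrete with a small sketch. The records below are invented purely for illustration (they are not COMPAS data), but they show how a false positive rate, the share of people who did not reoffend yet were labelled high risk, can be computed and compared across groups:

```python
# Hypothetical records, invented for illustration: not COMPAS data.
# Each record is (group, predicted_high_risk, actually_reoffended).
records = [
    ("black", True,  False), ("black", True,  False), ("black", False, False),
    ("black", True,  True),  ("black", False, True),
    ("white", True,  False), ("white", False, False), ("white", False, False),
    ("white", False, False), ("white", True,  True),  ("white", False, True),
]

def false_positive_rate(group):
    """Share of people in `group` who did not reoffend but were labelled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

print(round(false_positive_rate("black"), 2))  # 0.67
print(round(false_positive_rate("white"), 2))  # 0.25
```

ProPublica's actual analysis rested on thousands of real Broward County records, but the arithmetic is the same comparison of error rates across groups.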

Blind faith in imperfect technology has very real implications. The results COMPAS churns out affect the sentences doled out to defendants. When a Wisconsin convict challenged his 2-year sentence, the judge admitted that he would have granted a shorter one, had it not been for COMPAS (5). As such technology continues to become mainstream, the problem of bias urgently needs to be acknowledged and fixed (6). This is especially so because technology tends to be esoteric, and machine learning even more so. The algorithms are opaque and nearly impossible to scrutinize: the laypeople who use this technology cannot recognize the inherent biases, let alone possess the technical know-how to correct them (7).

Where does this bias come from? It boils down to the lack of representation behind the computer screen. On one level, much has been said about the homogeneity of the tech industry. Margaret Mitchell, a Microsoft researcher, describes the industry as a “sea of dudes” (8). Racial diversity is lacking even more than gender diversity: only 2% of Google’s workforce is African-American (9), and people of colour are far less likely to hold leadership positions than white men or women. For instance, white women were 91 percent more likely than African-American women to be executives in the tech industry, 178 percent more likely than Hispanic women, and 246 percent more likely than Asian women (10). Less visible, but equally important, is the lack of diversity of opinion. One criticism James Damore makes in his infamous Google memo is that the company has an aggressively leftist culture, where employees who hold contrary opinions are shunned and shamed into the fold (11). While some views are clearly abhorrent (in this case, that women are inferior to men in the field), a case can still be made for welcoming perspectives that challenge the norms that we, as privileged educated young people, take for granted. Diversity matters because it helps ensure a comprehensive set of data for algorithms to build on (12), and serves as an internal check on neutrality. As computer science professor Tina Eliassi-Rad of Northeastern University puts it, “Part of the problem in creating fair algorithms is the concept of fairness itself” (13).
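Eliassi-Rad's point, that fairness itself resists a single definition, can be illustrated with a toy sketch. The predictions below are invented, not from any real system; they show how two common formalizations of fairness, demographic parity (equal rates of positive predictions across groups) and equal false positive rates, can disagree on the very same predictions:

```python
# Invented predictions, for illustration only.
# Each row is (group, predicted_positive, actual_positive).
preds = [
    ("a", True, True), ("a", True, False), ("a", False, False), ("a", False, True),
    ("b", True, True), ("b", True, True),  ("b", False, False), ("b", False, False),
]

def positive_rate(group):
    """Demographic parity compares this quantity across groups."""
    rows = [r for r in preds if r[0] == group]
    return sum(r[1] for r in rows) / len(rows)

def false_positive_rate(group):
    """Error-rate fairness compares this quantity across groups."""
    negatives = [r for r in preds if r[0] == group and not r[2]]
    return sum(r[1] for r in negatives) / len(negatives)

print(positive_rate("a"), positive_rate("b"))              # 0.5 0.5 -> parity holds
print(false_positive_rate("a"), false_positive_rate("b"))  # 0.5 0.0 -> error rates differ
```

Results in the fairness literature show that, outside of degenerate cases, several such criteria cannot all hold at once, which is one reason "fair" has no single technical meaning.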

The use of machine-learning algorithms in automated decision-making reveals that technology can perpetuate the disempowerment of underrepresented groups in society. Technology needs to be inclusive and representative for it to be a force for good. This is particularly evident in developing countries.

Participation in technology and the digital world is itself empowering. Digital literacy, the ability “to find resources, critically evaluate and create information, and to do this by using digital technology”, is the key to unlocking the benefits that access to the Internet brings (14). For instance, the repository of freely available information online can radically change the way scientists and researchers in developing countries work, as they otherwise face resource limitations in conducting research (15).

Information technology can also be critical in bridging the gender gap, a significant problem in developing countries, where women are typically less educated than men and expected to take on traditional roles. As a result, they suffer from “information isolation”: they are uninformed of alternatives, which furthers their dependence on the men in their lives, who enjoy more freedom thanks to gendered cultural expectations (16). The issue is even greater for women in rural areas, where geographical isolation further limits their sources of knowledge. Information isolation has implications for women’s social, economic and political emancipation. Information technology alleviates this isolation: communication is better facilitated, allowing valuable information about health, employment and other relevant topics to be quickly and easily disseminated to this previously inaccessible group (17). Technology can provide a direct solution to gender inequality too. The recent news that girls in a Ghanaian village were banned from crossing a river while menstruating, causing them to miss school (18), is an example of the cultural barriers to women’s education. With technology, learning does not have to be confined to the physical classroom (19), giving girls in developing countries a chance at an education in cases such as this.

While remedying gender inequality is a noble end in itself, it also has the knock-on benefit of spurring economic growth. The relationship between gender equality and economic growth is well known: enabling women to become economic agents increases a country’s human capital endowment, and therefore its productivity and growth (20). Developing countries can also benefit economically by riding the general trend of the burgeoning tech industry. Technology is a key driver of economic growth; the industry alone is worth some US$6 trillion globally, which would rank it as the third-largest economy in the world (21). Spurred in part by the availability of relatively cheap labour, companies such as Texas Instruments and Microsoft have set up R&D facilities in India (22). The boon of technology has thus created jobs in developing countries, economically empowering those able to position themselves for it. On the African continent, countries like South Africa, Nigeria and Kenya have ambitions for a digital economy, and the rise of coding schools and the greater availability of digital skills development programs is one answer to the shortage of qualified developers (23).

Inclusivity determines whether technology empowers or disempowers. It matters at the creation stage of new technologies, because what a small sliver of society builds is used universally. The transformative power of technology makes its accessibility crucial for marginalized groups, such as women, and for developing countries as a whole.


1. Madrigal, Alexis. "Against 'Objective' Algorithms: The Case Of Google News." The Atlantic, 2012, https://www.theatlantic.com/technology/archive/2012/12/against-objective-algorithms-the-case-of-google-news/266137/.

2. Byrnes, Nanette. "Are Machine Learning Algorithms Biased?." MIT Technology Review, 2016, https://www.technologyreview.com/s/601775/why-we-should-expect-algorithms-to-be-biased.

3. Nicodemo, Allie. "How To Create Unbiased Algorithms In A Biased Society." News @ Northeastern, 2017, https://news.northeastern.edu/2017/12/how-to-create-unbiased-algorithms-in-a-biased-society/.

4. Angwin, Julia et al. "Machine Bias." Propublica, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

5. Ibid.

6. Knight, Will. "Google’s AI Chief Says Forget Elon Musk’s Killer Robots, And Worry About Bias In AI Systems Instead." MIT Technology Review, 2017, https://www.technologyreview.com/s/608986/forget-killer-robotsbias-is-the-real-ai-danger/.

7. Ibid.

8. Byrnes, Nanette. "Are Machine Learning Algorithms Biased?." MIT Technology Review, 2016, https://www.technologyreview.com/s/601775/why-we-should-expect-algorithms-to-be-biased.

9. Wong, Julia. "Segregated Valley: The Ugly Truth About Google And Diversity In Tech." The Guardian, 2017, https://www.theguardian.com/technology/2017/aug/07/silicon-valley-google-diversity-black-women-workers.

10. Mock, Brentin. "In Silicon Valley, Racial Disparities Are Even Worse Than Gender Disparities." Citylab, 2017, https://www.citylab.com/equity/2017/10/when-it-comes-to-tech-racial-disparities-are-far-worse-than-gender-disparities/542013/.

11. Bershidsky, Leonid. "If Google Is Biased, So Are Its Algorithms." Bloomberg, 2018, https://www.bloomberg.com/view/articles/2018-01-09/if-google-is-biased-so-are-its-algorithms.

12. Ito, Robert. "For AI, A Real-World Reality Check." Google, https://www.google.com/intl/en/about/stories/gender-balance-diversity-important-to-machine-learning/.

13. Nicodemo, Allie. "How To Create Unbiased Algorithms In A Biased Society." News @ Northeastern, 2017, https://news.northeastern.edu/2017/12/how-to-create-unbiased-algorithms-in-a-biased-society/.

14. Antonio, Amy, and David Tuffley. "Digital Literacy In The Developing World: A Gender Gap." The Conversation, 2014, https://theconversation.com/digital-literacy-in-the-developing-world-a-gender-gap-28650.

15. Cullen, Rowena. "Addressing The Digital Divide." Online Information Review, vol 25, no. 5, 2001, pp. 311-320. Emerald, doi:10.1108/14684520110410517.

16. Hafkin, Nancy, and Nancy Taggart. Gender, Information Technology, And Developing Countries: An Analytic Study. Learnlink Project, 2001.

17. Ibid.

18. Sharman, Jon. "Menstruating Girls 'Banned From Crossing River To Get To School'." The Independent, 2018, http://www.independent.co.uk/news/world/africa/menstruating-girls-river-cross-ban-ghana-school-attend-periods-a8156141.html.

19. WL Fong, Michelle. "Digital Divide: The Case Of Developing Countries." Issues In Informing Science And Information Technology, vol 6, 2009, pp. 471-478. Informing Science Institute, doi:10.28945/1074.

20. "The Case For Gender Equality." World Economic Forum, http://reports.weforum.org/global-gender-gap-report-2015/the-case-for-gender-equality/.

21. Cavallo, Marco. "The Growing Importance Of The Technology Economy." CIO, 2016, https://www.cio.com/article/3152568/leadership-management/the-growing-importance-of-the-technology-economy.html.

22. Archibugi, Daniele, and Carlo Pietrobelli. "The Globalisation Of Technology And Its Implications For Developing Countries." Technological Forecasting And Social Change, vol 70, no. 9, 2003, pp. 861-883. Elsevier BV, doi:10.1016/s0040-1625(02)00409-2.

23. Chutel, Lynsey. "South Africa Doesn't Have Enough Developers To Build A Digital Economy." Quartz, 2016, https://qz.com/714662/south-africa-doesnt-have-enough-developers-to-build-a-digital-economy/.

