Decolonizing standards
Abstract
This paper is based upon the Miles Conrad Lecture presented by Dr. Safiya Umoja Noble at the 2023 NISO Plus conference on February 15, 2023. Through her writing, speaking, and advocacy, Dr. Noble has drawn attention to the extensive biases and embedded racism that exist in many technologies that are used not only in scholarly research, but also increasingly in the wider community. This awareness has led to the beginnings of change in some areas of technology. Dr. Noble stresses that there is much work to be done moving forward, but hopefully all those in the information community can collaborate to advance inclusivity in technological applications, particularly as it relates to information search and discovery.
On the occasion of this lifetime achievement award from the National Information Standards Organization (NISO), I would like to say, “thank you”, and offer a provocation in the limited space I have to acknowledge this incredible honor by my professional field. I am calling this piece, “Decolonizing Standards”, and it truly is a provocation that I hope will help us open up new conversations and possibilities. As I write this, I want to first acknowledge that I am in Los Angeles, on the lands of the Gabrielino and Tongva peoples [1], who still steward and care for this area that we have come to occupy. As I work in this unceded territory, it is important for me to begin with this acknowledgement, because so many of the ways in which we move in the world, in our field, and in our profession around information rest on a whole set of relationships that often go unseen, underacknowledged, denied, and ignored.
I want you to go on a journey with me - a journey that I have also been on over the past few years of my career since the writing of Algorithms of Oppression: How Search Engines Reinforce Racism [2]. I have to say thank you so much to this community for reading that work, and for taking my work seriously.
Let me tell you a little bit about how I got to this moment of receiving the 2023 Miles Conrad Award and my thinking about a “lifetime of achievement”. I should also note that I am fifty-three years old, which was Miles Conrad’s age when he passed. I have been thinking about that since I learned about this award, and about what it means to reach certain milestones in one’s career when you think that you are really just getting started. I am certain Miles Conrad himself thought he was just getting started in his work and in his contributions, nowhere near believing he had made the kind of impact he intended for the long haul of his career.
I think of my own work in the context of trying to be seen, and trying to have the legacies of erasure addressed, which was my focus as I was entering the field of library and information science as an underrepresented Black woman. I went to grad school to think about how to ensure the knowledge of underrepresented communities could be kept alive in perpetuity. But I left grad school realizing that this endeavor would be left in the hands of technology companies rather than scholars and librarians if we were not diligent in the questions we asked and the projects we supported in the field. The period when I was in graduate school at the University of Illinois Urbana-Champaign coincided with the rise of commercial search engines such as Yahoo! [3] and Google [4], and while I was there, I began connecting the way people talked about these new emergent technologies with the narratives around earlier incarnations of the internet. I was questioning the way we circulate information, and how we would manage the future of knowledge for people of the global majority in institutions that struggled to include our voices, perspectives, and epistemologies across every discipline and domain. I was worried about the slippage of these responsibilities as they moved further into the corporate sphere, controlled and determined by technologists, not librarians, teachers, professors, researchers, or subject matter experts of all types. I was connecting these dots to broader projects of building equitable, multi-racial democracies that challenge concentrated power among tech elites and corporations.
But these questions started much earlier than grad school; they were percolating as I came of age online, witnessing the battle between early internet users and companies who began shifting the internet into a wholly commercial space. I started using the internet before there was a graphical user interface, which was before Mosaic [5], Netscape [6], and AOL [7] - it was the text-based internet. That particular early information environment focused on cultivated expertise and finding people in chat rooms who might be experts on a subject. What these sites of expertise offered was clarity about the subjective nature of information-seeking. These were networking spaces in many ways for a lot of people, and many of us witnessed a slow shift to the emerging commercial web--the web we now experience that is almost exclusively mediated and controlled by large corporations who index and organize information for us, or who build out spaces to chat and connect within the boundaries of their global platforms. The internet of today is heavily surveilled yet subject to virtually no regulation, applying almost no professional standards for moderating content and bearing no accountability for providing quality information or journalistic integrity.
We who work in information standards, classification, and organization have a much more sophisticated understanding of the technologies, protocols, and interoperability (or lack thereof) among different kinds of technical systems. The public, however, generally engages with these emergent systems knowing very little about their structure and organization. In fact, over the last two decades, we have seen how people place enormous trust in those who build these systems. They assume that those behind the scenes are making the right choices for them and have figured out how to create the best twenty results that they could ever want out of millions of potential websites.
Part of the reason why they trust those results is that oftentimes the things they find align with what they are looking for. For example, if they search for something banal such as, “Where is the nearest coffee shop?” commercial search engines will likely give them the right answer. However, once we look at more sophisticated questions about society or knowledge that have many global vantage points, we start to see that those results might be incredibly subjective, have a very clear point of view, and prioritize or privilege certain worldviews over others. We are seeing headlines every week of new interfaces to commercial search engines, such as generative AI chatbots like ChatGPT, providing profoundly troublesome results and recommendations that dehumanize, decontextualize, and promote ideas that are factually and/or morally wrong or destructive. We are living through a time when people are encouraged to relinquish decision making and expertise to search engine chatbots rather than avail themselves of the much more difficult process of seeking truth, or of fairly deliberating over the most difficult of matters. We increasingly abdicate the struggle over the standards and values by which we live, in the most just and humane ways possible, leaving machines to set them for us, and on this point, I feel that I am only at the beginning of what may be a long and arduous task of making this understood, rather than at the end of a lifetime of work on these problems.
This is one of the reasons that I study the banal, everyday spaces with which the public engages, rather than the usual knowledge spheres that experts or academics contend with. It is vital that we pay attention to these spaces because these technologies reshape and remake the world, and they are what is available to the general public.
So let me take you through some of the things that I have been thinking about since the writing of Algorithms of Oppression. These are ideas that I am working on now that I think could be a contribution to our field and the way that we engage with serious commitments to things such as diversity, equity, and inclusion. For me, these ideas start with some hard truths.
First, I want to acknowledge Chanda Prescod-Weinstein’s work, particularly her influential “Decolonising Science” reading list on Medium [8]. Her writing has been transformative in opening up and furthering conversations about what it means to decolonize science. I connect her ideas to decolonizing scientific standards and other kinds of information standards, not only because that area is my wheelhouse, but also because standards are such a significant set of logics upon which so many forms of knowledge rest. I also deeply appreciate the provocations of Rachel Adams’ work on whether it is even possible to decolonize AI [9], which raises many important issues that resonate with the questions I am asking in my work now.
One of the things that is important to note is that we are working in a field that has not made its political commitments entirely clear at all times. For example, we have ideas about access to information and knowledge, and yet we also have institutions, efforts, and histories that have limited scientific information to very specific groups of people and nation-states at the expense of and in a direct effort to preclude others from gaining access to that knowledge.
My hope for my current book project is that it will be an alternative history of the internet that we can read alongside the histories that we hold up and revere in the United States, particularly narratives that focus on great white men of science and the standards, models, and people that somehow keep us from seeing other people and ideas that were present in the milieu over time.
One of the things that we know, for example, is that there were early antecedents to technologies such as the internet that were incredibly influential, such as Vannevar Bush [10] and the memex [11] machine he proposed in 1945. These projects of early scientific librarians and information scientists were partly born out of the first two world wars. During World War II in particular, major advances in science led to weapons and military technologies that threatened the planet. It is important to situate these early developments in information science in relation to the world we live in now, because much of that era was about ensuring that scientists in what was then called the Third World, or among the Non-Aligned Movement [12], did not have access to those military advancements. These are the nations of color around the world, all throughout Africa, colonized countries in South America, in the Caribbean, in the Polynesian islands, and in Asia--places where colonial powers were committed to ensuring that scientific knowledge around what could be advanced militarily would not fall into the hands of those scientists. This was the crucial moment in which organizations such as the National Science Foundation (NSF) [13] came into existence, in 1950. In that Cold War moment, the United States and its allies were concerned with how scientific knowledge moved around, who had access to it, and making sure that certain peoples who were considered enemies of the United States did not have access to it.
Now, what does that have to do with today? We are living in a moment where we are still dealing with the effects of colonization and occupation of Indigenous Peoples’ lands. As a field, we have struggled to name our place and figure out our way to contribute to reimagining a more fair and just world. This is at the heart of what it means when we talk about things such as diversity, equity, and inclusion, because those notions come out of a post-civil rights moment in the United States, which is also connected to liberation movements around the world that are trying to reject colonization and occupation, that are trying to reclaim control over their natural resources, and that are trying to assert independence and liberation and have a fair and full footing with more powerful nation-states, governments, and peoples around the world.
In the United States in the 1950s and 1960s, when the scientific communities were responding to the Soviet Union and launching rockets into space, we also had incredible experiences of unrest and dissatisfaction where the American public was saying, we want to live in a world where everyone has equal rights and civil rights, where women have equal rights, and where women of color and Black women also have the right to vote. I think so much about what it might have been like for us to interrogate our role as a community - as a scientific community and a standards community - in the milieu of the times that have really brought us to this moment.
These are some of the provocations I hope will get our synapses firing and reframe our thinking around what our projects in the world are and what we are truly working on. We live in a moment of unprecedented inequality: since we began keeping records of it, every year has outpaced the previous one in terms of global economic inequality. We are living in a moment where scientific knowledge about how to solve that kind of problem is so important to measure, so we can understand things such as “quality of life” around the world, with shared understandings of what a high quality of life means.
We have to contend with what it means to live on a planet that we share, and where we are interconnected and interdependent. I believe information professionals and workers have to think about our practices in terms of the immorality of the current distribution of resources around the world. It is imperative that we keep reconsidering the role of technologists in obfuscating and distracting from so many crises that get flattened into data sets and datafied systems that strip us of a moral compass and of a sense of the lives and harms that are tied to so much of the work we do, whether we realize it or not.
We also live in a time when we have new conceptions of phenomena such as climate change that have been deepened and made more urgent by the people who are most negatively harmed by it. In fact, they have given us terms like “climate justice” to frame the climate crisis with a power analysis that must center justice in order to solve it. This is quite different from the times when people spoke of the environment or environmental catastrophe in terms of increased pollution, or holes in the ozone layer. What we have learned from climate scientists now is that their vocabularies about climate change, and the places where they go to study and think about the climate, have completely changed. Part of the reason for that change has been the demand that communities that have been overlooked by scientific interventions be included. We need to learn from, and center, the most urgent and impacted voices and communities to help us reframe our work, too.
Now we have ways of thinking about climate change where we center people, such as those who live on Polynesian islands that are degrading and disappearing due to rising sea levels. We center people who are losing the places where they have lived for generations to fires, to land that can no longer cultivate crops, and to changes that render the water either non-potable or inaccessible. Places that were once pristine and lush, where people lived and made community for thousands of years, are no longer available to human beings. These developments have changed the way we think about the use of science by prioritizing people who are most vulnerable to climate change. This is a powerful example of what it means to decolonize climate science by putting those who would be most harmed or most affected at the center, and it is a current example for us to study and learn from.
We also have issues and concerns about educational equity and access. For example, within the Los Angeles Unified School District, some forty thousand students never once logged in to their remote classes during the COVID-19 lockdowns and therefore were unable to continue advancing their education. This is one illustration of the complications we face around models of delivery that we developed and that no longer make sense as we navigate a changing world, one increasingly vulnerable to viruses and other epidemics that affect millions of people.
We have to think about what it means to make educational equity real and knowledge more available. Given the number of students who use commercial search engines such as Google to answer questions and find information, it is vital that we consider what it means for large swaths of scientific research and knowledge production across the humanities, social sciences, and arts to live behind paywalls to which only scholars and students at universities have access. We have a role to play in thinking about the changing nature of these systems that our own children and our neighbors’ children are reliant upon. In many cases these systems are being defunded, devalued, and in some instances, crumbling or unavailable. What is our role as information professionals and information scientists in remaking and opening up access to knowledge?
One of the things that I thought was so interesting is that the most searched-for terms in Google Search are for health-related information. This is so telling and important for us to watch and to think about, especially for those of us in the United States, where we do not have nationalized healthcare. When people do not have access to healthcare services, they will go on websites such as WebMD [14] and try to self-diagnose. This is also tied to the billion-dollar wellness industry that is fraught with nonscientific snake oil and dangerous ideas about how to protect oneself, such as the false claim that ivermectin, a drug typically used to deworm horses, is effective against COVID-19 [15]. So, these are the kinds of opportunities that I think are before us.
I would also like to address the issue of standardizing people. We have spent hundreds of years as scientists organizing as many things on Earth and in space as we can into elaborate scientific regimes about the world. This political set of processes also emanates from the earlier periods I mentioned under colonization, when many of the taxonomies of species created were also used in service of organizing people into racist hierarchies. Whether it is cranial measurements, measuring melanin in one’s skin, or studying the width of one’s face or the size of one’s brain, those kinds of eugenicist scientific markers have been used to determine and declare who is the superior class of human being. All of that information is also part of the history of standardized information and science, which has overdetermined and shaped the world that we inhabit. There are many people now working to refute, dislodge, speak back to, and disprove these racist, sexist, and heteronormative ideas about the standards by which we measure people. These fundamentally unethical and dangerous projects underlie our field and lay the epistemological framework for our work, and we must contest and reimagine them.
We are also in a moment where we are contending not only with the standardization of human beings, but with the making of models that are used to organize them and determine their worthiness in terms of access to a variety of services and systems. For example, we have models that recognize our faces and models that determine our creditworthiness. We have models that determine whether we should be admitted to college and whether we will be successful there. We have models that determine whether we should get a mortgage or a loan. Moreover, many of these models are increasingly moving away from algorithmic formulations based on long histories of flawed datasets (ones that preclude people and flatten the richness of our lived experiences) and into a new era where machine learning ingests so much data, and sorts us in so many different ways, that we might not be able to make sense of it all or intervene. This includes recent hot-button projects such as ChatGPT [16] and OpenAI [17], but it is imperative that we consider the underlying logics of social organization at work as these systems classify people into or out of different opportunities.
And for me, this is actually the place of my life’s work. I feel these are some of the most rich and interesting questions that are taking place at a time when we do not have other systematic protections, laws, customs, or practices to help us contend with, corral, govern, and make sense of the consequences of so many of these new projects.
This is the area where I think the field of information standards has much to offer. We should be at the forefront, and not passively allow our work to bolster colonial projects of the past, even if we did not know it was happening. Once we know, we have a responsibility to work with the knowledge to which we are introduced and to contend with it and grapple with it.
This does not mean that I have a prescriptive answer. What we are engaged in together is a process of figuring out what has come before us, what we have inherited, and what we are contributing to today. How might we remake some of the assumptions upon which we do our work? How might we reimagine business models? For example, are there business models by which publishers, libraries, and other institutions can make knowledge, especially scientific knowledge, accessible to the world, while also making space for new epistemologies and new ways of thinking? How can we hold space both to critique what has happened and to be more expansive about the kinds of futures that we want?
These are not unconscious actions. These are opportunities that we can embrace. For me, these issues are what I think about as I ask what it means to decolonize our field and our standards. It means that we question the assumptions upon which the scientific world is organized. We reimagine new possibilities. And of course, nowhere is there more of a sensibility that science evolves, that people evolve, that we discover and are introduced to new things than among scholars and scientists of all types. We observe new things and incorporate them, which means we do not have to be static. We do not have a static field, nor do we have a static orientation to the world in which we live.
We are living in a moment where there is a great consolidation of power, which brings up questions of who gets to organize and control knowledge and information. These questions are difficult to grapple with. In Europe and the European Union, there is more regulation of, and concern about, what it means for U.S. companies to control knowledge and information regimes in other parts of the world than there is in the United States. We have narratives about the continent of Africa, and narratives about South America, that are too narrow and that do not properly account for the voices and concerns emanating from these parts of the world. We have much to learn from places like Brazil, and from the Rohingya people in Myanmar, about the possibilities of reimagining both digital rights and the demands for real-world material repair from digitally-induced harms.
This is an incredible moment to be alive in our field and doing this work, because we have an opportunity to be expansive about how we think about information practice, and about what, for example, feminist ways of knowing and thinking could offer. I have tried to center Black feminist methods in my career, and I believe it has opened up new ways of thinking and possibilities for impact. We could reimagine our work in a way that centers the concerns of people who have been victims of past practices but who also have incredible ideas about how to find a way forward into a fairer and more just world, and feminist and ethnic studies fields have much to teach us that can move us forward.
We have barely scratched the surface. So many technologies and systems are changing, and just as we start to understand them, they evolve even more. New players enter into the marketplace, and different voices emerge and move forward to take the microphone for a minute. There is a great opportunity to bring hard conversations into the foreground; to rethink how we organize, disseminate, and share knowledge and information around the world; and to put science, information, and knowledge in service of the most pressing issues that there are.
At the end of the day, I feel that I am nowhere near a lifetime of achievement. I am just starting to have the kinds of conversations that are important to me. I am very grateful that you have given me this space to share some provocations for our field. I embrace this award because I think that for you to give it to me, knowing the things that I write about and the things that I work on, signals that there is a greater opportunity to bring hard conversations into the foreground, and to rethink how we organize and disseminate and share knowledge and information around the world.
About the author
Dr. Safiya Umoja Noble is the David O. Sears Presidential Endowed Chair of Social Sciences and Professor of Gender Studies, African American Studies, and Information Studies at the University of California, Los Angeles (UCLA). She is the Director of the Center on Race & Digital Justice and Co-Director of the Minderoo Initiative on Tech & Power at the UCLA Center for Critical Internet Inquiry (C2i2). She currently serves as Interim Director of the UCLA DataX Initiative, leading work in critical data studies for the campus. She is the author of the best-selling book on racist and sexist algorithmic harm in commercial search engines, entitled Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press), which has been widely reviewed in scholarly and popular publications. In 2021, she was recognized as a MacArthur Foundation Fellow for her ground-breaking work on algorithmic discrimination. She is the co-editor of two books: The Intersectional Internet: Race, Sex, Culture and Class Online (Peter Lang, Digital Formations, 2016), and Emotions, Technology & Design (Elsevier, 2015). Safiya holds a Ph.D. and M.S. in Library & Information Science from the University of Illinois at Urbana-Champaign. Website: https://safiyaunoble.com
References
[1] See: https://en.wikipedia.org/wiki/Tongva and https://gabrielinotribe.org, both accessed September 20, 2023.
[2] See: https://en.wikipedia.org/wiki/Algorithms_of_Oppression, accessed September 20, 2023.
[3] See: https://en.wikipedia.org/wiki/Yahoo!_Search, accessed September 20, 2023.
[4] See: https://en.wikipedia.org/wiki/Google_Search, accessed September 20, 2023.
[5] See: https://en.wikipedia.org/wiki/Mosaic_(web_browser), accessed September 20, 2023.
[6] See: https://en.wikipedia.org/wiki/Netscape_(web_browser), accessed September 20, 2023.
[7] See: https://en.wikipedia.org/wiki/AOL, accessed September 20, 2023.
[8] See: https://medium.com/@chanda/decolonising-science-reading-list-339fb773d51f, accessed September 20, 2023.
[9] R. Adams, “Can artificial intelligence be decolonized?”, Interdisciplinary Science Reviews 46 (2021).
[10] See: https://en.wikipedia.org/wiki/Vannevar_Bush, accessed September 30, 2023.
[11] See: https://en.wikipedia.org/wiki/Memex, accessed September 20, 2023.
[12] See: https://en.wikipedia.org/wiki/Non-aligned_movement, accessed September 20, 2023.
[13] See: https://www.nsf.gov, accessed September 20, 2023.
[14] See: https://www.webmd.com, accessed September 20, 2023.
[15] See: https://www.fda.gov/consumers/consumer-updates/why-you-should-not-use-ivermectin-treat-or-prevent-covid-19, accessed September 20, 2023.
[16] See: https://en.wikipedia.org/wiki/ChatGPT, accessed September 20, 2023.
[17] See: https://openai.com, accessed September 20, 2023.