I have watched the growth of critical librarianship into a mainstream movement with excitement. Critical librarianship supports the belief that, in our work as librarians, we should examine and fight attempts at social oppression.
Over the past few years, critical librarianship has become a force that pervades every area of our work, from reference to library instruction, collection development, cataloging, and storytime. Biweekly #critlib Twitter chats (critlib.org) address topics across all areas of librarianship. Many librarians are thinking about how they can fight for social justice in their work, which raises the question of whether that work reflects the neutrality that has long been a value in our profession.
One tenet of critical librarianship is that neutrality is not only unachievable, it is harmful to oppressed groups in our society. In a world that is fundamentally unequal, neutrality upholds inequality and represents indifference to the marginalization of members of our community. If the majority of what is published represents a white, male, Christian, heteronormative worldview, then we are not supporting the interests of other members of our communities by primarily buying those works. If Library of Congress Subject Headings whitewash injustices like Japanese-American internment with terms like “Japanese Americans–Evacuation and relocation,” then remaining neutral means endorsing that whitewashing.
Only recently have I begun to see critiques of technology through the lens of critical librarianship. It is tempting to believe that searching in our library discovery systems is somehow more neutral because results are ranked by relevance based on an algorithm. But that is far from the case: many researchers have found that algorithms can reproduce and reinforce the same negative stereotyping that occurs in society. Grand Valley State University librarian Matthew Reidsma published a fascinating analysis of bias in library discovery systems and explored some of the biased and offensive search results he found in his library’s discovery layer.
Whether or not we improve library discovery systems, our patrons are still using online tools that have been shown to be biased. In his TED talk “Beware Online Filter Bubbles,” Eli Pariser showed how tools like Google and Facebook personalize what users see and primarily show content that agrees with their worldview. Information scientist Safiya Umoja Noble studied Google search results to show the damage done to young black girls when the top results in a search for “black girls” are highly sexualized. When the internet tools we use provide a partial or distorted view of the world, they influence the attitudes, actions, and self-esteem of the people we serve.
Librarians may not be able to change Google or Facebook, but we can educate our patrons and support the development of the critical-thinking skills they need to navigate an often-biased online world. We can empower our patrons when we help them critically evaluate information and teach them about bias in search engines, social media, and publishing. Libraries that have created research guides for Black Lives Matter and other social justice topics are helping to curate information and points of view that might not be traditionally published.
We are not doing our job if we remain neutral when it comes to library technologies. Accepting many of the technologies available to support our missions means accepting technologies that are biased, not accessible, not protective of the privacy of our users, and not easily usable by some of our patrons. A commitment to social justice is a commitment to equal access, which is at the heart of our professional values. We are not being neutral when we advocate for our patrons, but we are being good librarians.