The Top Tech Trends on-demand session at the American Library Association’s Midwinter Meeting & Exhibits Virtual took a different tack this year. Rather than highlighting a range of new services or gadgets, the panel set out to discuss the dangers inherent in some of the technologies libraries might be interested in using, said moderator TJ Lamanna, emerging technologies librarian at Cherry Hill (N.J.) Public Library.
John Mack Freeman, manager of Gwinnett County (Ga.) Public Library’s Suwanee branch, opened with a clip that appeared to show former US President Barack Obama voicing support for Black Panther movie villain Erik Killmonger. (The clip was actually a BuzzFeed-produced PSA from director Jordan Peele, who imitated Obama’s speech patterns with cleverly synched video.)
It was a jarring example of a deepfake. Deepfakes use artificial neural networks to replace a person in an existing image or video with someone else’s likeness. Convincing deepfakes are expensive to produce, requiring technical skill and substantial computing power. Deepfakes produced with free, readily available tools such as social media filters and face-swap apps are often called “cheapfakes.”
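To make the underlying technique concrete: the classic deepfake pipeline trains a shared encoder alongside one decoder per identity, then swaps decoders at inference time so one person’s expression and pose are rendered as another person’s face. The PyTorch sketch below is a toy illustration of that architecture only; the image size, layer sizes, random stand-in data, and three-step training loop are all assumptions for demonstration, not details from the session.

```python
# Toy sketch of the face-swap autoencoder idea behind many deepfakes:
# a shared encoder learns identity-agnostic face structure; a separate
# decoder per person learns to render that person's likeness.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # latent "face code"
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# In a real system these would be aligned face crops from video frames;
# random tensors stand in here.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-3,
)
loss_fn = nn.MSELoss()

for step in range(3):  # real training runs for many thousands of steps
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The "swap": person A's expression and pose, rendered as person B.
fake = decoder_b(encoder(faces_a))
print(fake.shape)  # torch.Size([8, 3, 64, 64])
```

The expensive part of a convincing deepfake is not this architecture but the data collection, face alignment, and training time, which is why high-quality fakes remain costly while filter-based cheapfakes are free.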
Freeman said that deepfake technology itself doesn’t pose a threat and that deepfakes are not always produced with malicious intent. Face-swap apps and technology that inserts younger versions of actors into movies are purely for entertainment, for example, while AI developments in health care can digitally re-create the voice of someone losing speech to a degenerative disease like ALS. However, there are many problematic uses, such as involuntary pornography, financial fraud and phishing, and political misinformation designed to influence elections.
The main danger of deepfakes is that they create distrust in forms of media—namely, photos and videos—that were traditionally trusted. This makes information literacy more complex; librarians can’t just tell students and patrons to think critically. “In my opinion,” Freeman said, “our current information literacy tools are not up to the challenge of widespread synthetic media.”
Callan Bignoli, library director at Olin College of Engineering in Needham, Massachusetts, focused on COVID-19’s effects on technology and surveillance. She began by recommending the books The Age of Surveillance Capitalism, Automating Inequality, Dark Matters, and Race After Technology.
She noted that we don’t have much control over the ubiquitous technology that is surveilling and tracking us. Librarians often find themselves in the position of training people on technology because tech companies are not transparent about how their products work, leading to a division between people who understand the threats of tracking and surveillance systems and those who don’t. “It is a bit of a luxury to keep yourself safe,” she added. Other highlights from her presentation:
- Normalized surveillance. Data from security cameras, even those at libraries, can be fed into facial recognition systems, a technology that is gaining popularity despite known accuracy problems, particularly when identifying people of color. False matches can lead to wrongful arrests and deadly police encounters (the sketch after this list illustrates how such matches are made).
- Data retention. Should we have a permanent record? Bignoli argues no. “Right to be forgotten” laws in other countries allow private information to be removed from online databases, but in the US doing so is considered an affront to free speech.
- Disaster capitalism during COVID-19. Vendors and technologists have rushed to profit from libraries’ pandemic needs, among them a Massachusetts company that used prison labor to produce acrylic shields for libraries.
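To illustrate the false-match problem raised above: facial recognition systems typically reduce each detected face to an embedding vector and declare a match when two embeddings are close enough under some threshold. The sketch below uses made-up random vectors and a hypothetical threshold, not any real system’s parameters, to show how a probe face belonging to no one in a database can still trigger matches.

```python
# Minimal sketch of threshold-based face matching. Real systems use
# learned embeddings from a neural network; random vectors stand in here.
import numpy as np

rng = np.random.default_rng(0)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend database of embeddings extracted from, e.g., mugshot photos.
database = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

# Embedding of a face captured by a security camera -- here random,
# i.e., someone who is NOT in the database at all.
probe = rng.normal(size=128)

THRESHOLD = 0.2  # a hypothetical operator setting; lower = more "hits"

hits = [
    (name, sim)
    for name, emb in database.items()
    if (sim := cosine_similarity(probe, emb)) >= THRESHOLD
]
print(f"{len(hits)} false matches for a person not in the database")
```

Raising the threshold reduces false matches but also misses genuine ones; where that trade-off lands, and how well the embeddings perform across demographic groups, is a vendor choice invisible to the people on camera.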
Bignoli concluded that librarians should always err on the side of privacy over convenience and open discussions with colleagues and patrons about these issues through means such as book clubs, programming, and advocacy work.
Jeanie Austin, who earned their PhD in library and information science from the University of Illinois at Urbana–Champaign, has worked in jails and juvenile detention centers and now studies how power functions in carceral spaces. Though often mischaracterized as technologically empty, these spaces are heavily mediated by technology, they said.
Biometric technologies now coming into common public use have been trained on people who are incarcerated, Austin said. For example, in some prisons, inmates have had to submit voiceprints to gain access to phone calls, sometimes without being aware that this data was being collected. Video visitation software, facial recognition, and fingerprint and palm scanning are other examples.
Austin cited several works from their research on surveillance in prison, including Dark Matters, Carceral Capitalism, and Algorithms of Oppression. Even among people who oppose mass incarceration, Austin noted, there is often a push for more technology, such as tablets that can ostensibly be used for rehabilitation or at least to occupy the time of people who are incarcerated. But the information available on those tablets is controlled by the prison or the corporation that owns the technology.
With the help of government and police agencies, private companies that contract with prisons build large-scale databases covering people in prison, as well as the loved ones who contact them, and those databases are heavily biased by class and race. Austin’s research examined patents for these technologies and how the databases maintained outside prisons create what they called “further mechanisms of control and constriction.”
Austin noted that this subject is relevant to librarianship not only because it’s related to information, but because some of the early technologies that laid the foundation for this surveillance were funded by the same government grants that funded prison libraries in the 1960s and 1970s. They concluded by asking, “How do we create means of providing information that do not further bolster this type of racialized and criminalizing surveillance?”