Protecting Privacy in Libraries as AI Adoption Accelerates

By: Anusha Nasrulai

Like picking what movie to watch, what restaurant to eat at, or where to go on vacation, what we read next is often recommended to us by personalization algorithms. Social media and reading platforms such as Goodreads already process user data to generate recommended content. Recently, the library catalog browsing app Libby announced its own book recommendation feature, Inspire Me.

Inspire Me by Libby recommends books to users based on their own prompts or previously saved titles in the app. Originally announced as an optional feature, Inspire Me now features prominently at the top of the home screen when users open the Libby app. The feature recommends books available through the catalogs of the libraries with which users have linked accounts. When the feature was first announced, users and libraries showed resistance, voicing concerns about forced AI adoption and diminished patron privacy. OverDrive, the parent company of Libby, states that readers’ personally identifying data and reading activity are not provided to the AI model.

Libraries work with vendor platforms, distributors, and publishers to deliver library services, particularly for e-materials. Despite popular backlash, vendors are expanding development of AI integrations. OverDrive CEO Steve Potash has announced goals to use AI to “match users to content across its platforms,” which also include the streaming platform Kanopy and the K-12 education platform Sora. Other subscription vendor companies, such as OCLC, EBSCO, and Clarivate, have introduced AI features for content recommendation, enhanced search, text summaries, and AI-generated research assistants. Beyond externally marketed AI tools, vendors are incorporating AI into their internal workflows for “building, improving, and refining products.” Libraries are now balancing their duty to protect patron data and privacy with their mission to provide access to digital resources.

Legal Regulations

The integration of AI by vendor platforms poses new privacy considerations for libraries. AI introduces new risk points at data collection, processing, training, and deployment.

The United States currently has no comprehensive AI or data privacy laws. Instead, states have passed dozens of laws regulating certain AI use cases. As of now, six states have passed cross-sectoral AI governance laws that apply to commercial entities, and vendors are likely subject to state-level AI and data privacy laws that target such entities. Libraries can leverage these legal regulations to negotiate with vendors for stronger privacy protections. Trends in AI regulation show that states are increasingly passing and updating AI legislation amid legal challenges and an absence of federal regulation.

AI Governance and Contracting

In light of legal uncertainties, contracts and licenses are a key opportunity for imposing guardrails on AI use. These agreements address how vendors and third parties can collect, process, and disclose user data.

Often, vendor agreements do not explicitly disclose internal use of AI tools or AI model training. The research and policy organization Library Futures and its staff attorney, Layla Maurer, have presented on this issue, flagging that broad language around operational mechanisms and data usage may permit vendors to train and deploy AI models using patron and institutional data. When reviewing vendor contracts for AI usage, libraries should focus on:

  • The vendor’s rights around data use and sharing, including with third parties. Use of patron data for “analytics” or “development and improvement of services” may include AI training.
  • References to third-party applications, tools, processors, or contractors necessary to carry out services under the agreement.
  • Whether there is a defined data retention period and what happens to patron data when the contract ends.

Libraries can strengthen contract terms by including language requiring compliance with applicable federal and state laws, as well as with industry standards such as ISO and NIST. In addition, libraries may negotiate with vendors to:

  • Define user rights to data, including the right to opt out of nonessential data collection and the right to delete their data.
  • Limit secondary uses of data, including for training internal or external AI tools.
  • Disclose third-party partners and whether data is shared with or sold to third parties.
  • Conduct privacy and security audits.
  • Establish a data retention period and a protocol for destroying data at the end of that period.

As said by attorney Layla Maurer, “Updating contract language to allow flexibility around software development needs while retaining safeguards for what the licensee… wants to protect is not just an expeditious way to reach an agreement with a software vendor, it’s also a strategy that helps ensure the licensee can continue to safely use the software despite future legislative changes provided the vendor updates their software in a manner consistent with the intent of the legislation.”

Future-proofing

Digital lending and services are a popular means of accessing library materials, but they also raise new challenges for protecting patron privacy. As AI becomes embedded in these services, libraries need to adopt AI guardrails in contracting to manage the harms and opportunities of AI use in libraries, particularly around privacy.
