Best Practice: Promote Inclusivity When Using AI

Instructors should teach learners that generative AI tools can produce results that reflect or even amplify societal biases, may not be accessible to all users, and may not produce accessible content. Learners should be taught to account for these issues when using generative AI.

Applicable Principles

Inclusion, Equity, and Access

  • Instructors and learners should be aware of potential biases in their use of AI.
  • Instructors and learners should have equitable access to generative AI tools, and their needs should be accommodated.

Generative AI Literacy

  • Instructors and learners need to understand how generative AI works.

Academic Integrity

  • Instructors and learners must acknowledge when and how they have used AI in learning.
  • Learners are responsible for doing the work to learn.

Practical Application (the what)

Instructors should teach learners about the ways generative AI can produce biased results. Any algorithm can produce biased results: algorithms used in criminal sentencing, pretrial release determinations, mortgage lending, facial recognition, hiring, and many other domains have been shown to produce outcomes biased against particular demographic groups. Generative AI tools are no different, and instructors whose learners rely on them can help mitigate these harms.

One common source of bias is biased training data. Learners should be taught, for example, that some generative AI models are trained on text from sites like Reddit and Wikipedia, where user-generated content can include derogatory language or reflect discriminatory attitudes.

Instructors can also use generative AI tools themselves to help learners see biases they might otherwise miss. One approach is to run the same problem through different AI models: different models will generate different results, and comparing those results often surfaces biases that a single output would conceal. Finally, instructors must ensure that their courses give learners room to raise questions about possible biases.
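The cross-model comparison described above can even be scripted for classroom use. The Python sketch below is a minimal illustration, not a definitive implementation: `query_model_a` and `query_model_b` are hypothetical stand-ins for calls to two real generative AI services, and the placeholder responses here are invented for demonstration. The comparison simply lists words that appear in one response but not the other, which can become starting points for a discussion of bias (in this invented example, a gendered pronoun).

```python
# Minimal sketch of cross-model comparison for classroom bias discussions.
# query_model_a / query_model_b are hypothetical stand-ins: in practice they
# would call two different generative AI services with the same prompt.

import string


def query_model_a(prompt: str) -> str:
    # Placeholder response; replace with a real API call.
    return "The engineer explained his design to the team."


def query_model_b(prompt: str) -> str:
    # Placeholder response; replace with a real API call.
    return "The engineer explained their design to the team."


def tokenize(text: str) -> set[str]:
    # Lowercase and strip punctuation so the comparison is word-based.
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())


def compare_responses(prompt: str) -> None:
    a, b = query_model_a(prompt), query_model_b(prompt)
    only_a, only_b = tokenize(a) - tokenize(b), tokenize(b) - tokenize(a)
    print(f"Prompt: {prompt}")
    print(f"Only in model A's response: {sorted(only_a)}")
    print(f"Only in model B's response: {sorted(only_b)}")


compare_responses("Write one sentence about an engineer presenting a design.")
```

A word-level comparison like this is deliberately crude; its purpose is to make differences visible quickly, leaving the interpretation of whether a difference reflects bias to the instructor and learners.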

Accessibility experts can be brought in to verify that tools and interfaces meet accessibility standards, such as the Web Content Accessibility Guidelines (WCAG), and to suggest remediation. Testing with a diverse range of users can confirm that the tools are usable and generate feedback for developers. Output formats should be adjustable for screen readers, and images should always include alternative text descriptions. Other features should include simplified language options, captioning and audio descriptions, readability adjustment, and feedback mechanisms for tool refinement.
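As a small illustration of the alternative-text point above, the sketch below uses only Python's standard library to flag `<img>` tags in AI-generated HTML that lack alt text. The sample HTML is invented for demonstration, and a real accessibility audit would use a full checker; this is merely a first-pass screen.

```python
# First-pass screen for missing alternative text in AI-generated HTML.
# A real audit would use a full accessibility checker; this sketch only
# flags <img> tags whose alt attribute is absent or empty.

from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # (line, column) positions of offending <img> tags

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.missing.append(self.getpos())


# Invented sample: the second image lacks alternative text.
sample = """
<p>An AI-generated report.</p>
<img src="chart.png" alt="Bar chart of survey results by age group">
<img src="logo.png">
"""

checker = AltTextChecker()
checker.feed(sample)
for line, col in checker.missing:
    print(f"Missing alt text: <img> at line {line}, column {col}")
```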

Resources