When ChatGPT hit the web in the fall of 2022, jaws dropped across the globe. Suddenly, AI chatbots had gone from laughable to impressive—and a little bit scary. People began predicting that AI would kill education as we know it (we promise that it won’t). Teachers worried that students would use AI to do their homework and write their essays. And schools realized that it was probably time to add an official AI policy to their handbooks.
What is AI-generated content?
Not entirely sure what content generated by AI (artificial intelligence) is? Let’s ask ChatGPT itself! Here’s what the program said in response to the question, “What is AI-generated content?”:
AI-generated content refers to any form of digital content, such as text, images, videos, or audio, that is created by artificial intelligence systems without direct human intervention. These AI systems use advanced algorithms and machine learning techniques to generate content that mimics human creativity and decision-making processes. …
While AI-generated content has the potential to improve efficiency and creativity in various fields, it also raises ethical concerns related to plagiarism, copyright infringement, bias, and misinformation, which need to be carefully addressed when using and sharing AI-generated materials.
Have more questions? Learn more about AI-generated content at Conductor.
Why do schools need an AI policy?
Schools need to create AI policies for the same reasons they have plagiarism policies: to help students understand what’s acceptable and what isn’t. After all, we don’t tell students they can never use someone else’s writing in their own essays. Instead, we explain that they must always acknowledge and properly cite any sources they use. This helps students understand that they can’t pass someone else’s writing or ideas off as their own, but they can use them to support their own thinking.
An AI policy should do the same thing. AI isn’t necessarily the enemy—there are lots of legitimate uses for it. But if students use AI to do all their assignments, they won’t learn what they’re in school to learn. And when the time comes to demonstrate their knowledge when AI isn’t available to them (like an in-class test), they’ll likely fail.
So, a school AI policy benefits students as well as teachers. In its current form, AI technology is new to most users, and a good policy helps kids and their families know when and how to use it (and when not to use it).
Is using AI the same as plagiarism?
Some have argued that plagiarism policies are sufficient to cover AI as well. And while AI use and plagiarism have a lot of overlap, there are some important differences.
Plagiarism is copying another person’s work and passing it off as your own. This could be intentional, but it may also be accidental when writers aren’t educated on what plagiarism entails. Writers can avoid plagiarism by appropriately citing sources.
AI content is generated by a program using sophisticated algorithms trained on a wide range of content from the internet. Depending on the program, the content produced could be plagiarized from another source without attribution. If a writer uses this plagiarized content in their own work, they are also unknowingly plagiarizing.
Of course, the potential for accidental plagiarism isn’t the only concern about AI-generated content. But it’s important to make students aware that this is one potential issue with using AI programs.
How do schools get started with an AI policy?
Plagiarism policies have been around for a long time, but AI policies are fairly new, and you might be unsure how to begin. Here are some steps you might take.
- Gather a team. Put together a team that includes at least one person with a strong understanding of what AI technology is capable of. Others on the team might include administrators, teachers, students, parents, and legal advisors.
- Determine your goals. What do you want your AI policy to include? Will it be separate from other policies, or will you include it in your existing ethics and plagiarism policies? How specific will your policy be?
- Review examples. Take a look at policies written by other schools (see below) and highlight sections you want to include in your own policy. You might even ask ChatGPT or another AI content generator to create some text for you to consider.
- Draft a policy. Put a first draft into writing, and put it through your school’s policy review process. Be sure to ask for feedback from teachers, students, and families.
- Make edits, finalize, and publish. Use the feedback you’ve gathered to make any necessary edits, and ensure your language is clear and specific. Publish it according to your school’s guidelines.
- Educate staff and students. Don’t rely on your policy alone to help everyone use AI responsibly. Spend time educating both staff and students on the benefits and risks of using AI. See example lessons at Edutopia.
What should an AI policy for schools include?
A comprehensive AI policy requires much more than just telling students “Don’t use AI to cheat.” Schools should be specific in their guidelines, helping everyone understand what is and isn’t appropriate. These are some possible sections to include in your policy:
Appropriate AI Use
There are many ways students can use AI as a tool, rather than a way to cheat. Include examples of Dos and Don’ts in your policy to help make things clear.
Dos:
- Use AI programs as smart search engines that present information in ways that are easy to read and understand.
- Ask AI programs for clarification or explanations when you need help.
- Generate ideas, topics, and writing prompts using AI programs.
- Be transparent; attribute AI text and images properly when you use them in your own work.
Don’ts:
- Use AI programs to avoid doing your own work.
- Copy text or images from AI programs without proper attribution.
- Use AI text or images without fact-checking and exploring potential plagiarism issues.
- Use AI when your teacher expressly forbids it.
Responsible AI Use
This section should lay out the potential risks of using AI and what responsible use looks like. It should include safety cautions about sharing personal data with AI bots, as well as using them to invade others’ privacy.
Your policy should remind students that AI programs can reflect implicit bias and even present incorrect information. Anytime they use an AI program, they should think critically and fact-check using primary sources.
Reporting and Consequences
Use this section to encourage students to report any knowledge they have of AI misuse. Also, lay out the potential consequences if a student is found misusing AI. Will you align them with your plagiarism policies? Consider misuse an ethics violation? Each school must decide its own academic integrity policies, and AI violations should be part of them.
Education and Awareness
Schools should commit to educating students and staff about advances in AI technology and its responsible use. Consider requiring students to participate in AI-use education at the beginning of each school year.
Your policy should also clearly state any ways in which the school itself uses AI programs, from data collection and analysis to automatically generated notifications, etc. Note the school’s commitment to using AI fairly and safely.
AI Policy Resources for Schools
There’s a lot to consider as you formulate your policy. Try these resources to help.
- Alice Keeler: Acceptable Use Policy for AI in the ELA Classroom
- Common Sense Education: How To Handle AI in Schools
- UNESCO: AI and Education Guidance for Policy Makers
- Cleveland State University AI Policy Statement Examples
- University of Missouri: ChatGPT, Artificial Intelligence, and Academic Integrity
- University of British Columbia: Chat GPT and Other Generative AI Tools
Has your school written an AI policy, or are you working on one? Come share your ideas and ask for advice in the WeAreTeachers HELPLINE group on Facebook.