

How to Write an AI Acceptable Use Policy for Your School

Step-by-step guide to writing an AI acceptable use policy for your school or district — who to involve, what to include, common pitfalls, and how to get it adopted.

Policy & Governance · 12 min read

What should an AI acceptable-use policy for schools include?

Writing an AI acceptable use policy for your school means: (1) forming a cross-functional team, (2) starting from a template you adapt locally, (3) defining approved and prohibited uses clearly, (4) addressing privacy, academic integrity, and equity, and (5) building in a review cycle. Expect 4–8 weeks from kickoff to board-ready draft.

Author

Qaisar Roonjha

Founding Editor

Last updated

March 5, 2026

Content and metadata refreshed on the date shown.

Evidence level

Document reviewed

Signals are labeled so educators can separate vendor claims from reviewed documentation.

Sources checked

3

Each page lists the public materials used to support its claims.

Last verified

March 5, 2026

Useful for policy, pricing, and compliance signals that can shift over time.

Privacy, procurement, accessibility, and child-safety requirements vary by country, state, and institution. Treat U.S. FERPA/COPPA references as directional signals, not universal approval.


Why schools need a process, not just a document

Many schools rush to adopt a policy without a clear process. The result is often a document that sits unused because it was written in isolation, does not match local reality, or was never socialized with the people who will enforce it.

This guide walks you through how to write an AI policy that has a real chance of being adopted and followed.

Step 1: Form the right team

An AI policy touches instruction, technology, privacy, communications, and leadership. Involve people who own each area:

  • Instruction or curriculum lead — what is pedagogically acceptable
  • Technology or IT director — what is technically feasible and supportable
  • Privacy or compliance lead — FERPA, COPPA, state requirements
  • Communications — family and staff messaging
  • Principal or cabinet representative — authority to move decisions forward

A small core team (3–5 people) can draft; a slightly larger group can review and pressure-test. Avoid writing the policy in a silo.

Step 2: Start from a template

Do not start from a blank page. Use a structured template that covers the sections schools typically need:

  • Purpose and scope
  • Definitions (AI, generative AI, etc.)
  • Approved uses
  • Prohibited uses
  • Student privacy (FERPA/COPPA)
  • Academic integrity
  • Equity and access
  • Staff training expectations
  • Approved tool list
  • Incident response
  • Review cycle

The Free AI Policy Template for Schools provides this structure. Download it, then adapt each section to your context.

Step 3: Gather input before you write

Before drafting, gather input from:

  • Teachers — What are they already using? What do they need clarity on?
  • Students — What do they think is fair? (Age-appropriate forums)
  • Families — What questions do they have? What would make them more comfortable?
  • Legal or policy counsel — State requirements, board policy alignment

Use surveys, focus groups, or listening sessions. The goal is to avoid writing a policy that ignores how AI is actually being used or what stakeholders care about.

Step 4: Define approved and prohibited uses clearly

Vague language leads to inconsistent enforcement. Be specific.

Approved uses — Examples that work well:

  • Lesson planning and resource generation
  • Rubric and feedback drafting (with teacher review)
  • Differentiation and scaffolding support
  • Administrative and communication tasks
  • Accessibility and accommodation support

Prohibited uses — Examples that work well:

  • Submitting AI-generated work as original student work without disclosure
  • Using AI to circumvent assessment or evaluation
  • Entering student PII into non-approved tools
  • Using AI for decisions that require human judgment (e.g., discipline, placement)

Adapt these to your context. The key is that teachers, students, and families can point to the policy and know what is allowed.

Step 5: Address privacy, integrity, and equity

Every AI policy should explicitly cover:

Privacy (FERPA/COPPA)

  • Which tools are approved for student data?
  • What data can and cannot be entered into AI tools?
  • How does the school vet tools? (Use the FERPA Compliance Checklist)

Academic integrity

  • When must students disclose AI use?
  • What counts as appropriate vs inappropriate AI use in assignments?
  • How will the school respond to violations?

Equity and access

  • How will the school ensure AI does not widen the digital divide?
  • What supports exist for students without home access?
  • How will the school communicate with families? (Use the Parent Communication Checklist)

Step 6: Build in staff training and tool governance

A policy without training is unlikely to stick. Include:

  • Minimum professional development expectations
  • Who is responsible for training
  • How new tools get added to the approved list
  • How often the approved list is reviewed

Many schools tie tool approval to the FERPA Compliance Checklist and require a designated owner for the tool registry.
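One lightweight way to operationalize the registry is a record per tool with a designated owner and a next-review date. A hypothetical sketch (field names are illustrative, not drawn from any specific checklist):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ApprovedTool:
    """One entry in a school's approved-tool registry (illustrative fields)."""
    name: str
    owner: str            # designated owner responsible for reviews
    privacy_reviewed: bool  # passed the school's privacy vetting
    next_review: date     # when the approval must be re-checked

def tools_due_for_review(registry, today):
    """Names of tools whose scheduled review date has passed."""
    return [t.name for t in registry if t.next_review <= today]

# Example registry with one overdue entry.
registry = [
    ApprovedTool("LessonPlanner", "IT director", True, date(2026, 9, 1)),
    ApprovedTool("ChatTutor", "Curriculum lead", True, date(2026, 2, 1)),
]
print(tools_due_for_review(registry, date(2026, 3, 5)))  # ['ChatTutor']
```

The design point is that every tool has exactly one owner and an expiry, so the approved list cannot silently go stale.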

Step 7: Plan for incidents

Define what happens when something goes wrong:

  • Who is the first point of contact?
  • What is the escalation path?
  • How are families notified when student data or integrity is involved?
  • How is the incident documented for policy improvement?

Incident response sections often get overlooked until a crisis. Writing it in advance reduces improvisation when it matters most.
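A pre-agreed incident record keeps documentation consistent under pressure. A minimal sketch, with illustrative field names only (adapt them to your own escalation path):

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AIIncident:
    """Illustrative incident record for an AI-related event at a school."""
    reported_by: str
    summary: str
    involves_student_data: bool
    involves_integrity: bool = False
    first_contact: str = "Technology director"  # first point of contact
    escalated_to: list = field(default_factory=list)
    family_notified: bool = False
    reported_at: datetime = field(default_factory=datetime.now)

    def requires_family_notification(self):
        # Rule from the bullets above: notify families when student
        # data or academic integrity is involved.
        return self.involves_student_data or self.involves_integrity

incident = AIIncident("Ms. Rivera",
                      "Student PII pasted into an unapproved chatbot",
                      involves_student_data=True)
print(incident.requires_family_notification())  # True
```

Even if the record lives in a spreadsheet rather than code, agreeing on the fields in advance is what makes post-incident review possible.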

Step 8: Set a review cycle

AI changes fast. A policy written in 2026 may need updates in 2027. Include:

  • Annual review (minimum)
  • Designated owner (e.g., technology or curriculum lead)
  • Process for proposing changes
  • Board or policy committee alignment

Common pitfalls to avoid

  1. Writing in isolation — Policies written by one person without input rarely get adopted.
  2. Copy-pasting without adaptation — Templates are starting points. Local context matters.
  3. Skipping legal review — State and local requirements vary. Get counsel involved.
  4. Ignoring current use — If teachers are already using AI, the policy must address that reality.
  5. No communication plan — Roll out the policy with clear messaging to staff, students, and families.

Timeline: what to expect

Phase | Duration | Activities
Team formation | 1–2 weeks | Identify owners, schedule kickoff
Input gathering | 2–3 weeks | Surveys, focus groups, legal check
Drafting | 2–3 weeks | Adapt template, internal review
Stakeholder review | 1–2 weeks | Cabinet, teacher reps, communications
Board preparation | 1–2 weeks | Final edits, board packet

Total: roughly 4–8 weeks from kickoff to board-ready draft. Phases often overlap (input gathering can run alongside early drafting), and district size and process affect the pace.

What to do next

  1. Download the Free AI Policy Template
  2. Form your cross-functional team
  3. Run existing tools through the FERPA Compliance Checklist
  4. Read What Is AI in Education? for broader context
  5. Visit the Resources hub for related free material

Use this guide as one step in your district's broader AI decision-making process, alongside tool evaluation and family communication.

Sources used for this guide

U.S. Department of Education — Guidance: Protecting Student Privacy

Official federal guidance documents and technical assistance materials for FERPA-related privacy review.

Accessed Mar 5, 2026
