Implementation guide
How to Write an AI Acceptable Use Policy for Your School
Step-by-step guide to writing an AI acceptable use policy for your school or district — who to involve, what to include, common pitfalls, and how to get it adopted.
Primary question
What should an AI acceptable use policy for schools include?
Writing an AI acceptable use policy for your school means: (1) forming a cross-functional team, (2) starting from a template you adapt locally, (3) defining approved and prohibited uses clearly, (4) addressing privacy, academic integrity, and equity, and (5) building in a review cycle. Expect 4–8 weeks from kickoff to board-ready draft.
Last updated
March 5, 2026
Content and metadata refreshed on the date shown.
Evidence level
document reviewed
Signals are labeled so educators can separate vendor claims from reviewed documentation.
Sources checked
3
Each page lists the public materials used to support its claims.
Last verified
March 5, 2026
Useful for policy, pricing, and compliance signals that can shift over time.
Jurisdiction note
Privacy, procurement, accessibility, and child-safety requirements vary by country, state, and institution. Treat U.S. FERPA/COPPA references as directional signals, not requirements that apply everywhere.
Why schools need a process, not just a document
Many schools rush to adopt a policy without a clear process. The result is often a document that sits unused because it was written in isolation, does not match local reality, or was never socialized with the people who will enforce it.
This guide walks you through how to write an AI policy that has a real chance of being adopted and followed.
Step 1: Form the right team
An AI policy touches instruction, technology, privacy, communications, and leadership. Involve people who own each area:
- Instruction or curriculum lead — what is pedagogically acceptable
- Technology or IT director — what is technically feasible and supportable
- Privacy or compliance lead — FERPA, COPPA, state requirements
- Communications — family and staff messaging
- Principal or cabinet representative — authority to move decisions forward
A small core team (3–5 people) can draft; a slightly larger group can review and pressure-test. Avoid writing the policy in a silo.
Step 2: Start from a template
Do not start from a blank page. Use a structured template that covers the sections schools typically need:
- Purpose and scope
- Definitions (AI, generative AI, etc.)
- Approved uses
- Prohibited uses
- Student privacy (FERPA/COPPA)
- Academic integrity
- Equity and access
- Staff training expectations
- Approved tool list
- Incident response
- Review cycle
The Free AI Policy Template for Schools provides this structure. Download it, then adapt each section to your context.
Step 3: Gather input before you write
Before drafting, gather input from:
- Teachers — What are they already using? What do they need clarity on?
- Students — What do they think is fair? (Use age-appropriate forums)
- Families — What questions do they have? What would make them more comfortable?
- Legal or policy counsel — State requirements, board policy alignment
Use surveys, focus groups, or listening sessions. The goal is to avoid writing a policy that ignores how AI is actually being used or what stakeholders care about.
Step 4: Define approved and prohibited uses clearly
Vague language leads to inconsistent enforcement. Be specific.
Approved uses — Examples that work well:
- Lesson planning and resource generation
- Rubric and feedback drafting (with teacher review)
- Differentiation and scaffolding support
- Administrative and communication tasks
- Accessibility and accommodation support
Prohibited uses — Examples that draw clear lines:
- Submitting AI-generated work as original student work without disclosure
- Using AI to circumvent assessment or evaluation
- Entering student PII into non-approved tools
- Using AI for decisions that require human judgment (e.g., discipline, placement)
Adapt these to your context. The key is that teachers, students, and families can point to the policy and know what is allowed.
Step 5: Address privacy, integrity, and equity
Every AI policy should explicitly cover:
Privacy (FERPA/COPPA)
- Which tools are approved for student data?
- What data can and cannot be entered into AI tools?
- How does the school vet tools? (Use the FERPA Compliance Checklist)
Academic integrity
- When must students disclose AI use?
- What counts as appropriate vs. inappropriate AI use in assignments?
- How will the school respond to violations?
Equity and access
- How will the school ensure AI does not widen the digital divide?
- What supports exist for students without home access?
- How will the school communicate with families? (Use the Parent Communication Checklist)
Step 6: Build in staff training and tool governance
A policy without training is unlikely to stick. Include:
- Minimum professional development expectations
- Who is responsible for training
- How new tools get added to the approved list
- How often the approved list is reviewed
Many schools tie tool approval to the FERPA Compliance Checklist and require a designated owner for the tool registry.
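The registry itself can be as simple as a spreadsheet, but keeping it in a structured format makes review cycles easier to enforce. The sketch below is a minimal, hypothetical example of what a registry entry and an overdue-review check might look like; the tool names, field names, and dates are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ToolRecord:
    """One entry in a school's approved-AI-tool registry (illustrative fields)."""
    name: str
    owner: str                 # staff member responsible for this entry
    ferpa_reviewed: bool       # passed the district's privacy checklist
    student_data_allowed: bool # may student PII be entered at all?
    approved_on: date
    review_due: date           # when the entry must be re-checked

def tools_due_for_review(registry, today):
    """Return names of tools whose scheduled review date has passed."""
    return [t.name for t in registry if t.review_due <= today]

registry = [
    ToolRecord("ExampleLessonPlanner", "Curriculum Lead", True, False,
               date(2026, 1, 15), date(2027, 1, 15)),
    ToolRecord("ExampleChatTool", "IT Director", False, False,
               date(2025, 8, 1), date(2026, 2, 1)),
]

print(tools_due_for_review(registry, date(2026, 3, 5)))  # → ['ExampleChatTool']
```

However the registry is stored, the point is the same: a named owner, an explicit review date per tool, and a routine check that flags anything overdue.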
Step 7: Plan for incidents
Define what happens when something goes wrong:
- Who is the first point of contact?
- What is the escalation path?
- How are families notified when student data or integrity is involved?
- How is the incident documented for policy improvement?
Incident response sections often get overlooked until a crisis. Writing it in advance reduces improvisation when it matters most.
Step 8: Set a review cycle
AI changes fast. A policy written in 2026 may need updates in 2027. Include:
- Annual review (minimum)
- Designated owner (e.g., technology or curriculum lead)
- Process for proposing changes
- Board or policy committee alignment
Common pitfalls to avoid
- Writing in isolation — Policies written by one person without input rarely get adopted.
- Copy-pasting without adaptation — Templates are starting points. Local context matters.
- Skipping legal review — State and local requirements vary. Get counsel involved.
- Ignoring current use — If teachers are already using AI, the policy must address that reality.
- No communication plan — Roll out the policy with clear messaging to staff, students, and families.
Timeline: what to expect
| Phase | Duration | Activities |
|---|---|---|
| Team formation | 1–2 weeks | Identify owners, schedule kickoff |
| Input gathering | 2–3 weeks | Surveys, focus groups, legal check |
| Drafting | 2–3 weeks | Adapt template, internal review |
| Stakeholder review | 1–2 weeks | Cabinet, teacher reps, communications |
| Board preparation | 1–2 weeks | Final edits, board packet |
Total: roughly 4–8 weeks from kickoff to board-ready draft, depending on district size and process.
What to do next
- Download the Free AI Policy Template
- Form your cross-functional team
- Run existing tools through the FERPA Compliance Checklist
- Read What Is AI in Education? for broader context
- Visit the Resources hub for related free material
Next steps
Use this guide alongside the related resources below.
Policy resource
AI Academic Integrity Policy Template for Schools and Universities
Policy resource
Free AI Policy Template for Schools
Comparison
Khanmigo vs ChatGPT for Schools
Comparison
Best AI Tools for Schools in 2026 — Independent Comparison
Tool review
MagicSchool AI Review (2026)
Tool review
Microsoft Copilot for Education
Tool review
Curipod Review (2026)
Sources
Sources used for this guide
Guidance | Protecting Student Privacy
Official federal guidance documents and technical assistance materials for FERPA-related privacy review.
Accessed Mar 5, 2026
Children's Online Privacy Protection Act
Statutory COPPA reference used for parent rights, consent, and child-data protections.
Accessed Mar 5, 2026
Guidance for generative AI in education and research
Global guidance on human-centred AI adoption, policy design, and education-specific risks.
Published Sep 6, 2023 · Accessed Mar 5, 2026