You've heard about AI tools that can create worksheets in seconds, generate quiz questions on any topic, and write lesson plans that actually make sense. But can you just start using them?
The answer depends on two things: what the tool does with data, and what your district's policy says. Let's break it down.
The Short Answer
If the AI tool collects student data: You almost certainly need district approval.
If the AI tool is teacher-only and collects zero student data: The privacy burden is much lower — but check your district's technology policy anyway.
That distinction — whether student data enters the AI system — is the dividing line between "I need to file a request with IT" and "I can use this today."
When You Definitely Need Approval
You need to get your district's permission before using any AI tool that:
- Requires students to create accounts. Student accounts mean student data collection — names, emails, potentially usage data. FERPA applies.
- Allows students to interact directly with AI. When students type prompts into an AI tool, their inputs may contain personal information. This is data collection under both FERPA and COPPA (for students under 13).
- Processes student work. If you're uploading student essays, test answers, or writing samples to an AI tool for grading or feedback, student education records are being shared with a third party.
- Integrates with your LMS or Google Classroom. Any tool that connects to your school's learning management system can access student data through that integration.
For these tools, your district needs a Data Processing Agreement (DPA) with the vendor — a legal contract that specifies how student data will be handled, stored, and protected under FERPA.
When You Probably Don't Need Approval (But Should Still Check)
Some AI tools operate in a way that never touches student data:
- Teacher-only content generation. You enter a topic and grade level. The AI creates a worksheet, quiz, or lesson plan. You print it or share it with students. No student data enters the system.
- No student accounts or logins. Only the teacher has an account. Students interact with the output (a printed worksheet, a shared PDF), not the AI tool itself.
- No student PII in prompts. You're asking the AI to "create a 5th grade math worksheet on fractions" — not "analyze Johnny's test scores."
In these cases, the AI tool is functionally equivalent to a teacher using a search engine to find worksheet templates. The generated materials are just documents.
That said, many districts have blanket policies about AI tools — even ones that don't collect student data. Always check your district's acceptable use policy before adopting any new technology.
The Three-Category Framework
Here's a simple way to think about AI tools and approval requirements:
| Category | Examples | Approval Needed? |
|---|---|---|
| Student-facing AI | AI chatbots for students, student-facing tutoring tools, AI writing assistants that students use directly | Yes — always. DPA required. |
| Teacher tools that can process student data | AI grading tools, student work analyzers, tools where teachers paste student writing | Yes — if student data enters the system |
| Teacher-only, zero student data | AI worksheet generators, lesson plan creators, quiz builders where only the teacher provides input | Usually not — but check district policy |
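For readers who like their decision rules explicit, the table above can be restated as a short illustrative function. This is purely a sketch of the framework in this article, not legal advice; the function name and return strings are made up for illustration:

```python
def approval_needed(student_facing: bool, student_data_enters: bool) -> str:
    """Restates the three-category framework as a decision rule."""
    if student_facing:
        # Category 1: students interact with the AI directly.
        return "Yes - always; DPA required"
    if student_data_enters:
        # Category 2: teacher tool, but student data enters the system.
        return "Yes - student data enters the system"
    # Category 3: teacher-only, zero student data.
    return "Usually not - but check district policy"
```

The key observation the code makes plain: the first two branches both end in "yes," so the only fast path is the third one, where no student data is involved at all.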
What About ChatGPT?
ChatGPT is a general-purpose AI tool, not education-specific. This creates specific considerations:
- Age requirement: OpenAI's terms require users to be at least 13, and users under 18 need a parent or guardian's permission. Students under 13 cannot use ChatGPT without violating OpenAI's terms.
- Data retention: By default, ChatGPT conversations may be used for model training; even on paid individual plans, you have to opt out in the settings.
- No DPA framework: ChatGPT is not designed for school procurement the way education-specific tools are.
- Many districts have banned it: Some districts have explicit policies prohibiting ChatGPT use on school networks.
If you want to use ChatGPT personally to prepare lesson materials at home, that's generally your prerogative — but you should never enter student PII into any consumer AI tool.
How to Get Approval When You Need It
If you've found an AI tool you want to use and it requires district approval:
- Start with your technology coordinator or IT department. They typically manage the tool approval process.
- Gather the vendor's privacy documentation. Privacy policy, terms of service, and any DPA or student privacy pledge.
- Identify the specific features you want to use. Some tools have teacher-only features that may not require the same level of review.
- Reference your state's student privacy requirements. Many states have their own laws beyond FERPA (California's SOPIPA, New York's Ed Law 2-d, Illinois' SOPPA).
- Be prepared to wait. District technology review processes can take weeks or months. For immediate needs, consider tools that don't require approval.
The Fastest Path: Tools That Don't Need Approval
If you want to start using AI in your classroom today — without waiting for a district review process — the fastest path is a teacher-only tool that collects zero student data.
TeachTools was built for exactly this use case. Teachers generate worksheets, quizzes, lesson plans, and rubrics. Students receive the finished materials. No student accounts, no student data, no FERPA triggers.
It's the AI equivalent of a teacher using a word processor to create a handout. The tool is a productivity aid for the teacher — not a platform students interact with.
The Bottom Line
The question isn't really "can I use AI tools?" — it's "does this tool collect student data?"
If it does, you need approval. If it doesn't, you have much more flexibility — though you should still check your district's policy.
The best way to use AI in your classroom without getting tangled in approvals: choose tools that never touch student data in the first place.