Brisk Teaching is an AI-powered Chrome extension that integrates with Google Workspace — Docs, Slides, and Classroom. It helps teachers create lesson plans, generate assessments, and provide AI-powered feedback on student writing.
If your school uses Google Workspace and you're considering Brisk, the FERPA question is worth examining carefully — because Chrome extensions that touch Google Docs raise specific privacy considerations.
## Brisk Teaching's Privacy Stance
Brisk Teaching publicly commits to student data privacy and FERPA compliance. As an education-focused company, it is aware of the regulatory requirements around student data in K-12 environments.
Key points from their public documentation:
- They offer Data Processing Agreements (DPAs) for school districts
- They state they do not sell student data
- They operate within the Google Workspace ecosystem, leveraging Google's existing security infrastructure
## The Chrome Extension Data Model
What makes Brisk Teaching unique — and what requires careful FERPA evaluation — is its Chrome extension architecture. Unlike standalone web apps, Brisk operates inside your browser, with access to content in Google Docs, Slides, and other Workspace tools.
This creates two distinct use cases with very different privacy implications:
### Use Case 1: Teacher-Only Content Creation
When a teacher uses Brisk to generate a lesson plan, create a rubric, or draft assessment questions in their own Google Doc:
- Only the teacher's content is processed
- No student data enters the AI pipeline
- FERPA risk is minimal
### Use Case 2: AI Feedback on Student Work
When a teacher uses Brisk's feedback features on a Google Doc containing student writing:
- The student's text is sent to the AI for analysis
- This text may contain the student's name, writing style, opinions, and potentially sensitive content
- The AI processes this data to generate feedback
- This triggers FERPA requirements because student work is being shared with a third-party service
The FERPA implications depend heavily on which features you use and whether the documents you're working with contain student PII.
## Key Questions Before Using Brisk Teaching
| Question | Why It Matters |
|---|---|
| Will you use it on documents containing student work? | Student writing sent to AI constitutes sharing student records with a third party |
| What Chrome permissions does the extension request? | Extensions with broad permissions can access more data than you realize |
| Does your district have a signed DPA with Brisk? | Without a DPA, there's no contractual guarantee of FERPA-compliant data handling |
| Where does the AI processing happen? | Data sent to third-party AI providers introduces additional privacy considerations |
| Can you limit the extension to teacher-only documents? | Restricting use to non-student documents minimizes FERPA exposure |
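The permissions question in the table above can be checked concretely. Every Chrome extension declares its access in a manifest file, and broad host patterns are what let an extension read page content. The snippet below is a hypothetical illustration (it is not Brisk's actual manifest) of the kind of host permissions a Workspace-integrated extension might request; a pattern like `https://docs.google.com/*` means the extension can access any Doc open in the browser, including documents that contain student work.

```json
{
  "manifest_version": 3,
  "name": "Example Workspace Extension",
  "host_permissions": [
    "https://docs.google.com/*",
    "https://classroom.google.com/*"
  ],
  "permissions": ["activeTab", "storage", "scripting"]
}
```

You can see any installed extension's actual permissions by opening chrome://extensions and clicking Details on the extension in question; broad host patterns like the ones above are exactly what a district privacy review should flag and weigh against a signed DPA.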
## Brisk Teaching vs. TeachTools: Privacy Comparison
| | Brisk Teaching | TeachTools |
|---|---|---|
| How it works | Chrome extension inside Google Workspace | Standalone web app — teacher-only interaction |
| Can access student work? | Yes — through Google Docs integration | No — students never interact with the platform |
| Student data processed? | Possible, when grading/feedback features are used on student docs | Never — zero student data by design |
| DPA required? | Yes, if student data is processed | Available, but not required — nothing to process |
| FERPA approach | Compliance through policies and agreements | Compliance by architecture — no student data exists |
Brisk Teaching is a powerful tool, especially if your school is deeply invested in the Google Workspace ecosystem. But its Chrome extension model means it can see student work — which creates privacy obligations that need to be managed.
TeachTools takes a different approach: teachers use the AI to generate materials, and students receive the finished product. No Chrome extension, no document access, no student data in the pipeline.
## The Bottom Line
Brisk Teaching can be used in a FERPA-compliant way, but doing so requires intentional configuration and a signed DPA. The safest approach is to limit AI features to teacher-created documents and to avoid running AI analysis on student work without district approval.
If your goal is zero privacy overhead — no DPAs to manage, no extension permissions to audit, no student data in the AI pipeline — a teacher-only tool eliminates the entire category of risk.
The best privacy architecture isn't one that protects student data carefully. It's one where student data never enters the system in the first place.
Related reading: