A county clerk's office in Florida processes approximately 80 FOIA (Freedom of Information Act) requests per month. Most requests are routine: police reports, building permits, meeting minutes, budget documents. A few are complex: all emails between two departments during a specific period, or all contracts with a specific vendor over five years.
The routine requests took 3 to 5 days. The complex requests took 15 to 30 days. The average across all requests was 22 days, and the state reporting threshold that triggers scrutiny is 30 days. They were consistently close to the line.
The bottleneck was not finding documents. Their document management system could retrieve responsive documents in minutes. The bottleneck was redaction review.
The redaction problem
Before releasing any document in response to a FOIA request, someone must review every page for content that is exempt from disclosure. Social Security numbers. Medical information. Personnel file content. Attorney-client privileged communication. Ongoing investigation details. Juvenile records.
A routine request might involve 20 pages. A complex request could involve 500 pages or more. Each page needs human review to identify exempt content, mark it for redaction, and document the exemption category used.
The county had one full-time records analyst doing this work. She was good at it, thorough and consistent, but there are only so many pages one person can review in a day. At her pace of approximately 40 pages per hour, a 500-page request took 12.5 hours of review time. With 80 requests per month and only one reviewer, the backlog was structural.
What the tool does
The tool pre-screens documents for potential redaction-required content. It reads each page and flags content that matches categories of exempt information:
Patterns: Social Security numbers (XXX-XX-XXXX format), dates of birth, phone numbers, email addresses, account numbers. These are structural patterns the tool catches with near-perfect accuracy.
Contextual flags: References to medical conditions, personnel actions (termination, discipline, performance review), attorney-client communication markers ("privileged and confidential," "attorney work product"), and investigation status language. These require the tool to understand context, not just patterns.
The tool marks each flag with a confidence level and the exemption category it believes applies. High-confidence flags (SSN patterns, explicit privilege markers) are highlighted in red. Medium-confidence flags (contextual references that might be exempt) are highlighted in yellow. The reviewer focuses attention on yellow flags and spot-checks red flags.
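The two-tier flagging described above can be sketched roughly as follows. The pattern list, category names, and confidence rules here are illustrative assumptions for the sake of the example, not the county's actual rule set:

```python
import re
from dataclasses import dataclass

# Illustrative structural patterns; the real library is larger and statute-specific.
PATTERN_RULES = {
    "ssn": (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "personal_identifier"),
    "phone": (re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"), "personal_identifier"),
    "email": (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "personal_identifier"),
}

# Contextual markers surface at medium confidence for reviewer attention.
CONTEXT_MARKERS = {
    "privileged and confidential": "attorney_client",
    "attorney work product": "attorney_client",
    "performance review": "personnel",
    "termination": "personnel",
}

@dataclass
class Flag:
    page: int
    text: str
    category: str
    confidence: str  # "high" (red highlight) or "medium" (yellow highlight)

def prescreen_page(page_num: int, text: str) -> list[Flag]:
    """Flag potential exempt content on one page; decisions stay with the human."""
    flags = []
    for _, (pattern, category) in PATTERN_RULES.items():
        for match in pattern.finditer(text):
            flags.append(Flag(page_num, match.group(), category, "high"))
    lowered = text.lower()
    for marker, category in CONTEXT_MARKERS.items():
        if marker in lowered:
            flags.append(Flag(page_num, marker, category, "medium"))
    return flags
```

A page containing an SSN and a privilege marker would yield one high-confidence flag and one medium-confidence flag, mirroring the red/yellow triage the reviewer works from.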
What the tool does NOT do
It does not redact. It does not make disclosure decisions. It does not determine whether content is actually exempt — it identifies content that might be exempt and presents it to the human reviewer for a decision.
This distinction was non-negotiable with the county attorney's office. The legal liability for improper redaction rests with the county. A tool that makes redaction decisions creates unacceptable risk. A tool that identifies candidates for redaction and requires human approval creates efficiency without additional liability.
The CLAUDE.md governing this tool includes: "Never redact or recommend redaction. Flag and categorize potential exempt content. All redaction decisions require human authorization. Log every flag, every decision, and the identity of the authorizing reviewer."
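A minimal logging sketch consistent with those requirements might look like the following. The record fields and append-only JSON-lines format are assumptions for illustration, not the county's actual implementation:

```python
import json
import datetime

def log_decision(log_path: str, request_id: str, flag: str,
                 decision: str, reviewer: str) -> None:
    """Append one audit record. Every decision requires a named human reviewer."""
    if not reviewer:
        raise ValueError("Redaction decisions require an authorizing reviewer")
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "request_id": request_id,
        "flag": flag,          # what the tool surfaced
        "decision": decision,  # e.g. "redact" or "release" -- made by the human
        "reviewer": reviewer,  # identity of the authorizing reviewer
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

Refusing to write a record without a reviewer identity is the code-level expression of "all redaction decisions require human authorization."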
The impact on response time
The records analyst went from reviewing 40 pages per hour to effectively reviewing 120 pages per hour. She was not reading faster — she was reading with a map that told her where to focus.
For routine requests (20 pages), the review went from 30 minutes to 10 minutes. For complex requests (500 pages), the review went from 12.5 hours to approximately 4 hours. The analyst reported that the AI pre-screen caught everything she would have caught and flagged several items she might have missed in a long review session.
Average response time dropped from 22 days to 4 days within the first two months. Complex requests dropped from 15-30 days to 5-8 days.
The accuracy question
The county attorney required a parallel review for the first 30 days. The analyst reviewed 200 documents using both the AI-assisted process and her traditional process, comparing results.
The AI pre-screen flagged 100% of pattern-based exempt content (SSNs, DOBs, account numbers). For contextual flags, it identified 94% of content the analyst would have flagged, and 12% of its contextual flags were false positives: content it flagged that the analyst determined was not actually exempt.
The false positive rate is acceptable because it costs the reviewer a few seconds to dismiss a flag. The 6% that the tool missed were edge cases involving unusual phrasing of privileged communications. The analyst added those phrasings to the tool's pattern library, which improved accuracy in subsequent months.
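The feedback loop described above, where the analyst adds missed phrasings to the pattern library, can be as simple as the following sketch. The JSON storage format and function name are hypothetical:

```python
import json
from pathlib import Path

def add_phrasings(library_path: str, category: str, phrasings: list[str]) -> None:
    """Add analyst-identified phrasings to a JSON pattern library so future
    pre-screens flag them. Entries are lowercased; duplicates are ignored."""
    path = Path(library_path)
    library = json.loads(path.read_text()) if path.exists() else {}
    existing = set(library.get(category, []))
    library[category] = sorted(existing | {p.lower() for p in phrasings})
    path.write_text(json.dumps(library, indent=2))
```

Keeping the library as a plain file the analyst can extend is what lets accuracy improve month over month without a vendor release cycle.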
The cost
The build took five days. The tool runs on county infrastructure. No per-page licensing. No vendor contract. No ongoing subscription.
The county attorney estimated that hiring a second records analyst would cost $52,000 per year including benefits. The tool cost less than one month of that salary to build and effectively tripled the existing analyst's throughput.
The public trust element
Faster FOIA responses improve public trust. When a journalist or citizen requests records and receives them in 4 days instead of 22, the perception shifts from "the government is stalling" to "the government is responsive."
The county clerk told me that complaint calls about records delays dropped to near zero after the first month. That is not a metric most technology projects track, but for a county government, it matters as much as any efficiency number.
The audit trail
Every FOIA request, every document reviewed, every flag generated, every redaction decision, and every release is logged with timestamps and the reviewer's identity. The county can produce a complete history of how any request was handled, what was flagged, what was redacted, and who authorized the release.
This audit trail existed before the tool — it was just maintained manually in a spreadsheet. Now it is automatic, complete, and searchable. When the state auditor reviews FOIA compliance, the county can produce records instantly instead of compiling them from multiple sources.
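An automatic, searchable audit trail of this kind can be as simple as one database table. The schema and query below are an illustrative sketch, not the county's actual design:

```python
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS audit_events (
    id INTEGER PRIMARY KEY,
    request_id TEXT NOT NULL,
    event_type TEXT NOT NULL,   -- 'flag', 'decision', or 'release'
    detail TEXT,
    reviewer TEXT,              -- NULL for tool-generated flags
    occurred_at TEXT DEFAULT CURRENT_TIMESTAMP
);
"""

def request_history(conn: sqlite3.Connection, request_id: str) -> list:
    """Return the complete, ordered history of one records request."""
    return conn.execute(
        "SELECT event_type, detail, reviewer, occurred_at "
        "FROM audit_events WHERE request_id = ? ORDER BY occurred_at, id",
        (request_id,),
    ).fetchall()
```

One query per request is what turns the state auditor's compliance review from a compilation exercise into an instant lookup.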
What this means for your agency
If your office processes public records requests and the bottleneck is review rather than retrieval, you have the same opportunity. The document management system you already use contains the documents. The exemption categories are defined by statute. The review process follows a consistent pattern that can be accelerated with pre-screening.
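Because the exemption categories are defined by statute, the pre-screen's configuration can be little more than a mapping from flag category to exemption description. The categories below are illustrative placeholders; an agency would replace them with citations to its own state's public records law:

```python
# Hypothetical category-to-exemption mapping; populate from your state's statute.
EXEMPTION_CATEGORIES = {
    "personal_identifier": "Social Security numbers and similar identifiers",
    "medical": "Medical and health information",
    "personnel": "Personnel file content",
    "attorney_client": "Attorney-client privileged communication",
    "investigation": "Ongoing investigation details",
    "juvenile": "Juvenile records",
}
```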
The tool does not replace the reviewer. It makes the reviewer fast enough to handle the volume without additional headcount. In a government environment where new positions require budget approval and hiring takes months, that is the difference between meeting your response deadlines and explaining to the state why you are not.