Give your agents ears.
Give Claude, Cursor, and other MCP-compatible tools access to live transcripts during recording and past transcripts later.
7-day free trial, no credit card required.
Built for the AI-first workflow.
Native MCP Server
No extra server to deploy. Parrot Scribe hosts its own Model Context Protocol server, giving your tools secure, local access to live transcripts during recording and past transcripts later.
Live Transcription for AI
Feed live transcription into your AI workflows while the session is still happening. Ask your LLM what was just said without waiting for the meeting to end.
Secure and Local
Communication happens over local Unix domain sockets. Your agents get the data, but your audio stays private on your Mac.
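The local transport described above can be sketched in miniature. The socket path, request shape, and reply fields below are illustrative stand-ins, not Parrot Scribe's actual protocol; the point is only that both ends of a Unix domain socket live on the same machine, so transcript data never crosses the network:

```python
import json
import os
import socket
import tempfile
import threading

# Hypothetical socket path for illustration; the real path will differ.
SOCK = os.path.join(tempfile.mkdtemp(), "scribe.sock")

# Stand-in for the local server: bind a Unix domain socket on this Mac.
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(SOCK)
srv.listen(1)

def serve_once():
    # Answer a single request with a transcript snippet, then close.
    conn, _ = srv.accept()
    with conn:
        conn.recv(1024)  # read the agent's request
        conn.sendall(json.dumps({"transcript": "the speaker just said: hello"}).encode())

t = threading.Thread(target=serve_once)
t.start()

# Client side: how an agent on the same machine would fetch the live transcript.
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
    cli.connect(SOCK)
    cli.sendall(b'{"method": "transcript/live"}')  # illustrative request shape
    reply = json.loads(cli.recv(4096))

t.join()
srv.close()
os.unlink(SOCK)
print(reply["transcript"])  # prints "the speaker just said: hello"
```

Because the endpoint is a filesystem path rather than a network port, access is governed by ordinary file permissions on the Mac, which is what keeps the audio and transcript data local.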
Evidence
Read the details before building a live meeting assistant around your transcript layer
These pages explain the local MCP model, the privacy boundary, and how Parrot Scribe can feed live and historical transcripts into the tools, prompts, and models you choose.
Read the source
MCP server
Setup details, token model, and how live transcript access works when you enable it.
Open page →
Read the source
Real-world operating workflow
How local capture, review, and downstream AI use fit together in practice.
Open page →
Read the source
Local architecture rationale
Why the product is shaped around on-device control instead of cloud collaboration.
Open page →
Read the source
Encryption architecture
How transcript storage and keys are protected when you use local AI workflows.
Open page →