Anything you say to an AI notetaker can and will be used against you
If you're using Otter or Fathom, you need to read this
Finally, your employer has subscribed to an AI notetaker.
Great news, right? You don’t have to scribble action points anymore. You don’t have to spend 20 minutes after the call writing up a summary. Woohoo.
You’ve saved 20 minutes.
Except they’re not really your 20 minutes. They belong to your employer.
And now your employer has gained something far more valuable than your extra time. They’ve gained a complete, permanent record of everything you said. Every hesitation. Every frustration. Every half-baked idea. Every client name. Every number you let slip. Your thought process. The way you phrased things. How the other person reacted.
And the AI notetaker? If it’s Otter* or Fathom**, both of which admit to training on your data (although Fathom claims you can opt out), then congratulations. You’ve just handed them gold dust. Because you didn’t just give them notes. You gave them your ideas, your thought process, your voice, your phrasing, the way you interact with people. All of it. To them, that’s priceless. That’s what turns founders, shareholders, and investors into billionaires.
Fireflies, by the way, takes a different stance: it promises not to train on your meeting data. Which proves the point: if one company can refuse, the others are making a choice.
What’s really happening to that call recording
Otter and Fathom admit in their terms that they use your meeting data to train their AI. They say it’s safe because they de-identify or remove your identity. But what does that actually mean?
My interpretation is that it doesn’t mean your words vanish. It doesn’t mean your ideas are erased. It means your name and your email address are stripped off the file. That’s it.
Everything that happened in the meeting stays. The client you mentioned by name. The competitor you criticised. The numbers you revealed. The personal details you let slip. The sarcastic joke you made. The brilliant idea you shared. The feedback praising that idea. The way someone else referred to you. The casual details about your life you dropped without thinking. All of it is still there - just without your identifying details attached.
Do you really want to give this information to an AI model?
Your data is valuable - and not just to you
I’m starting to wonder whether your real value to these AI notetaker companies isn’t your subscription fee (some of them are free to use anyway). It’s the mountain of private conversations they’re gathering from you.
Large language models like ChatGPT are already trained on the entire internet. But what they can’t reach are the conversations that happen behind closed doors: board meetings, sales calls, negotiations, job interviews, team brainstorms. That’s the missing piece. And that data is worth potentially billions to them.
If your AI notetaker has recorded all your conversations and then gets acquired by a bigger AI player, all of that meeting content could suddenly be used to train the next generation of AI models. Imagine ChatGPT-7 or ChatGPT-8 trained not just on Reddit and Wikipedia and YouTube, but on thousands of private business conversations - including yours. Does the fact that they’re de-identified make this any better?
Imagine your competitor using a model that knows exactly how people in your industry and in your company pitch, negotiate, and sell. A model that knows what really works. And what doesn’t. And all of that becomes available to them, built on the back of your words.
There are no safe spaces anymore
And if you’re not worried about what the companies behind these notetakers would do with your data, have a think about how your employer could use it. Once every meeting is searchable, management doesn’t need to remember what you said; they can simply pull it up. Every hesitation, every disagreement, every vent is now documented and reviewable.
You might think your employer would never do that. But what if someone buys your company? Then your boss isn’t in control anymore. Do you trust the future owner?
And does your company store this data securely? How confident are you that it won’t be leaked or hacked?
So what started as saving you 20 minutes (which wasn’t really saving you 20 minutes; it was saving 20 minutes of your time for your employer to use elsewhere) has ended up with you handing over the raw material of your working life: your words, your ideas, your reactions, your thought processes, your intonation, sometimes your voice, sometimes your face, your phraseology. All neatly packaged, stored, and ready to be used in ways you never agreed to.
You don’t have to agree to this
Employee, solopreneur, random person who finds themselves on a call with Otter, Fireflies, Fathom and the rest - let me tell you something. You can refuse.
You can refuse to be recorded. You can ask your employer what their lawful basis is for processing your data this way. You can point out that efficiency isn’t enough. You can absolutely say no.
You have the right not to be turned into training data. You have the right not to have your words and your face turned into a permanent dataset.
Your employer might back down. Or they might claim they have a “legitimate interest” in recording. They might say it reduces admin burden, improves accuracy, ensures compliance (do you want to work with an employer who has to ensure compliance that way?). They’ll say it helps people who missed the meeting. They’ll point to security policies and claim the data is safe. It might do all those things, but ask them to confirm in writing how it complies with the Information Commissioner’s official guidelines.
*From the Otter.ai website: “Otter uses a proprietary method to de-identify user data before training our models so that an individual user cannot be identified. This training method is automatic and as such audio recordings and transcripts are not manually reviewed by a human. Additionally our training data is encrypted.”
**From the Fathom Trust Center: “Fathom uses de-identified customer data to improve the accuracy of our proprietary AI models in order to improve our service for all users. You can opt out of this anytime in your user settings.”