What Risks Do Note-Taking Technologies Pose To Your Company?

June 27, 2025

Note-taking and transcription technologies are time savers, and they’re getting quite accurate too. It’s no wonder we’re seeing them pop up more and more often. The allure of automated note-taking is strong – it ensures that ideas and to-dos are captured in real time, instead of discussed and promptly forgotten.

However, there are some very real business and legal risks that transcription and note-taking technologies pose to your company. We walk through them in this article so you can make an informed decision about the tech you implement and take steps to mitigate the risks. 

What Are AI Note Takers?  

Before we dig into the risks, let’s quickly cover the types of technologies we’re talking about here.  

AI note-takers are typically either standalone software applications or features within a larger platform that use artificial intelligence to automate the note-taking process. The software usually relies on speech-to-text transcription and natural language processing to function.

Usually, these applications organize information from meetings, lectures, or interviews – identifying speakers, extracting key information, and summarizing content – though some offer complete transcripts of meetings. When used, the software will typically appear as an “additional attendee” in Zoom, Google Meet, and Microsoft Teams meetings.

These technologies are gaining in popularity because they automate the note-taking process, increase accessibility by making written records more common, and enhance productivity.

Business Risks of AI Note-Taking Software 

Unfortunately, implementing AI note-taking software isn’t without risk. On the business operations side, the risks range from mildly annoying to potentially alienating your team, so they’re worth being aware of.

Hallucinations 

Generative AI is known to ‘hallucinate’ from time to time. A hallucination occurs when the AI generates content that is plausible but fabricated or inaccurate.

In the context of meeting notes, this could manifest as misattributed statements, ‘ghost’ participants (attendees cited who were not there), incorrect figures, or made-up points of discussion. While often subtle, such inaccuracies can have far-reaching implications, leading to poor decision making, miscommunications, and potentially costly rework. 

Quality of Notes 

The quality of the note taking can vary widely depending on the platform chosen, as well as participant characteristics. Things like accents, differences in pitch or voice tenor, speech impediments, or other subtle differences in the way we speak can lead to decreases or at least variations in quality of the outputs from the note taker.  

Attendee Discomfort 

There’s also a very clear human element at play. The human participants may hesitate to contribute fully to a meeting if a transcription AI is present – particularly if no training was provided explaining the technology.

Any discomfort felt by your team will likely be amplified if the AI note taker’s output is inconsistent or low quality.

Bias  

Finally, bias is a very real risk when adopting any automated application. In the context of note taking, the application may exhibit deference to speakers it considers more senior, potentially based on pronunciation or the use of certain words – much like the bias risk we see when hiring decisions are made by certain AI platforms.

Legal Risks 

Beyond the operational pitfalls, there are legal risks that could come with significant consequences – like data breaches, litigation risk, and a lack of ownership over your own information and notes.  

Extremely broad permissions 

The permissions that applications sometimes request should give you pause – and AI note takers are no exception. We’ve seen AI note takers that request access to view, edit, and even write in other applications like your contacts, calendar, Google Workspace, Zoom, and more. While these requests may not seem surprising – and these permissions may even save you time – they are a significant security risk. More on that in the next section.

Third-party data breaches 

It’s important to remember that when you use an AI note-taker, you are likely going to trust it with personal and sensitive information, alongside trade secrets and confidential information. Your discussions that the AI listens in on may include things like proprietary strategies, unreleased product plans, and more of the secret sauce that makes your company valuable.  

By using the AI note taker, you are sharing this information with a third-party platform – which comes with significant risks. In the event of a data breach or insider threat, your company’s information could be exposed and/or sold, leading to reputational damage, regulatory action, contractual liabilities, and the risk that competitors could capitalize on your confidential business strategies, trade secrets, or client data.  

Licensing risks and intellectual property ownership 

Whichever AI note taker you select, you’ll want to read the terms of service to determine who owns the AI-generated outputs and whether the owner of the AI note taker grants itself any license to use the information shared with it.  

Data collection and handling 

In a similar vein, you’ll also want to confirm how the owner of the AI note-taker collects, stores, and uses the data you share with it. Particularly, you’ll want to determine whether the agreements grant the AI owner the rights to use your information for training purposes – or to sell it to data brokers or other third parties.  

These terms are crucial and should not be overlooked during your procurement process for the AI note taker.  

Discovery obligations 

There is a risk that the notes generated by AI note takers could lead to an increase in the amount of information that is discoverable by third parties in the event of litigation against your company.  

Privilege concerns 

Legal privilege is a fundamental legal principle that protects certain confidential communications from being disclosed in legal proceedings without consent. The key is that the communications must be confidential and intended to remain confidential. AI note-taking apps can put this at risk.

For example, if AI note-taker settings permit meeting minutes or summaries to be distributed automatically to attendees, legal advice intended to be privileged could be circulated without appropriate vetting, notification, or consent from legal counsel – in other words, the automated process could waive privilege and you’d potentially need to disclose that information in the course of litigation.  

Best Practices For Businesses Using AI Note-Taking Apps 

The risks outlined above aren’t intended to persuade you not to use AI note-taking apps. Instead, we want you to understand the risks and mitigate them where it’s necessary or desirable to do so.

Here are some best practices that can help you avoid some of the business and legal risks and pitfalls: 

Implement an AI policy to reflect your stance 

An AI policy is a good starting point for building better practices among your wider teams. It should be relatively easy to understand, easy to access, and clear on your company’s position on generative AIs.  

Individual team members should not be left to decide which tools to use on their own. Instead, you should establish a formal procurement process, and your AI policy should clearly list approved tools to ensure consistency, security, and regulatory compliance.  

If you decide to allow AI note-takers, your policy should include:  

  • Acceptable use cases
  • Data-handling guidelines
  • Requirements for obtaining consents from attendees
  • Protocols (including a timeline) for reviewing and storing AI-generated content

Human-in-the-Loop reviews 

Building on the point above, it’s important to have a human involved in AI automations in your business – especially when it comes to generative AI. Ideally, you should create a policy requiring a human to review the AI-generated notes within a reasonable timeframe after the meeting (no longer than a few days). This practice makes it more likely that hallucinated content will be caught and that anything the AI missed is captured.

Understand the data flows from your AI provider 

This really is a critical step. The reality is that you have no idea what risk you are exposed to if you don’t understand how the owner of your AI note taker uses your data. At a minimum, you should know where the data is stored, what security measures are in place to keep it safe (including access controls), who has access to it and for what purposes, and whether the data is shared or sold.  

We’d suggest prioritizing vendors that are transparent about data flows and, ideally, offer local storage or restricted use for paid versions. Requesting SOC 2 reports or other security certifications from any AI vendors is also a good practice. 

Consider having a waiting room ‘as standard’ for your meetings 

If you participate in meetings where commercial or personal information may be shared, it’s good practice to configure your meeting platforms to use a waiting room by default. This is because many note-taking apps need to be ‘admitted’ to the meeting to take notes. A waiting room encourages attendees to consider whether the AI note taker is needed for each meeting, and makes it less likely that the tool becomes a ‘background participant’ that ends up overhearing things it potentially should not record.

Train your team on the risks of AI note takers 

If you adopt an AI note-taking application, we’d suggest training your team members on the potential risks and providing guidance on the appropriate use cases. Team members who meet with external stakeholders should know enough about the application to answer questions about data handling.

If your company uses AI tools and is unsure of the risks associated with them, reach out. Our privacy counsel is available to assist.  

Disclaimer

The materials available at this website are for informational purposes only and not for the purpose of providing legal advice. You should contact your attorney to obtain advice with respect to any particular issue or problem. Use of and access to this website or any of the e-mail links contained within the site do not create an attorney-client relationship between CGL and the user or browser. The opinions expressed at or through this site are the opinions of the individual author and may not reflect the opinions of the firm or any individual attorney.
