
Does your CEO know you let AI review your contracts?
I'm noticing a new trend in how internal teams are using AI. Sales teams are drafting agreements and letting AI handle the negotiations. HR is generating employee policies and documents. Operations is running supplier agreements through AI. Marketing is agreeing to advertising terms after a quick AI scan.
Everyone thinks they’re being efficient. They think they’re moving faster, cutting through red tape, getting things done. But here’s what they’re forgetting: when it goes wrong, you’re the one who’s accountable. Not the AI. You.
AI is great at finding issues that aren’t actually there. It draws conclusions that make zero sense from a business perspective. It can’t tell the difference between a theoretical risk and commercial reality. It doesn’t understand that sometimes you really do need to close this deal by Friday, even if the liability cap isn’t perfect. It doesn’t get that sometimes good enough is actually good enough, and that waiting for perfect means losing the opportunity entirely.
Did you receive legal’s approval?
Your CEO will always ask the same question: “Did legal sign off on this?” And by legal, they mean an actual person. Someone who’s done the work, understands your business, and can give you a practical answer you can stand behind. Someone who knows what matters and what doesn’t. Someone who can look at a contract and say “yes, this is fine” or “no, we need to fix this” and actually explain why.
AI doesn’t know your company’s risk appetite. It wasn’t around for the 80 decisions that came before this one—the decisions made after hours of discussions between all stakeholders. The decisions that never made it into writing because they were shaped by context and experience, not just data. The decisions that form the foundation of how your business actually operates, as opposed to how it’s supposed to operate on paper.
We all know that running a business requires taking risks. Smart risks, backed by a risk appetite that's been calibrated over years of wins and losses. And you need someone who can tell you which risks are worth taking. Or at least tell you what trouble the company will get into if you don't listen.
So here’s the real question: do you trust AI enough that you want to take accountability for its decisions?
Because when something goes wrong—when the contract you thought was fine turns out to have a problem, when the terms you agreed to come back to bite you, when the policy you drafted creates a liability you didn’t see coming—the person answering for it won’t be the AI. It’ll be you.

Upgrade Legal