March 23, 2026


Contract Data Governance: Why AI Alone Won't Fix Conflicting Terms - E130

Show Notes

Episode 130 - When a company acquires another business, it inherits a tangled web of contracts — each with its own payment terms, clauses, and conditions. Contract one says 30 days, contract two says 45, and contract three says you never have to pay. So what happens when you point AI at this mess? In this episode of What Counts, we explore why contract data governance must come before AI deployment. AI is a powerful governance accelerator, but only when organizations harmonize their terms, configure their systems, and maintain human oversight every step of the way. Episode length: 00:17:12 Learn more by visiting our website, or by sending TrailBlazer an email at [email protected].

Episode Transcript

[00:00:01] Speaker A: Hello and thank you for joining us. Welcome to What Counts, the podcast where we explore the real world challenges and opportunities shaping information governance today. Each episode draws on our experience working across industries, turning proven strategies into practical insights you can apply inside your own organization. [00:00:21] Speaker B: You just sounded a little tired there, Lee. [00:00:24] Speaker A: It's Friday. You know how it goes. Whether you're navigating information governance, facing a specific need, or simply curious about issues like email management, retention, contract data, or asset data management, this podcast gives you clear, actionable perspective on what truly counts in building strong, sustainable governance practices. This is Lee. In this episode, Maura and I are diving into one of the biggest shifts happening across organizations right now: AI as both a growth accelerator and a governance test. All right, so I ran a little experiment. And first of all, there's a big difference between who likes AI in this conversation and who doesn't really trust it just yet, and I'll let you decide who that is. But I ran an experiment and I asked an AI, a copilot to be exact. Maybe we shouldn't use names, but it doesn't matter, it's too late. I said, give me an example of an automated workflow that used to take hours, but that now, with AI, within one contained business, could go much faster. And it gave me a great example: a customer support escalation workflow. It said that it could read the entire ticket. It could search the CRM, the customer relationship management system, for the customer's history. It can pull contracts from a contract repository, a document repository. It can identify the clauses that are relevant to the ticket. It can draft a response. It could decide whether to escalate to legal or not. And it can document the whole thing for audit purposes.
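The workflow the copilot proposed can be sketched in miniature. This is a hypothetical illustration only: the systems are stub dictionaries, every name is invented, and the "escalate or not" judgment is deliberately reduced to a trivial rule to foreshadow the discussion that follows.

```python
# Hypothetical sketch of the copilot's escalation workflow. The systems
# are stub dictionaries and the function names are invented; the
# "escalate or not" judgment is reduced to a trivial rule on purpose.

CRM = {"C-001": {"name": "Acme Corp", "open_tickets": 2}}
CONTRACTS = {"C-001": {"clauses": {"payment": "Net 30", "support": "24h response"}}}

def handle_ticket(ticket: dict) -> dict:
    cid = ticket["customer_id"]
    history = CRM.get(cid, {})                        # pull customer history
    contract = CONTRACTS.get(cid, {})                 # fetch the contract
    # "Identify the relevant clause" is a naive key lookup here; a real
    # system would need language understanding at this step.
    clause = contract.get("clauses", {}).get(ticket["topic"], "NOT FOUND")
    draft = f"Per your contract, the {ticket['topic']} term is: {clause}."
    escalate = clause == "NOT FOUND"                  # the judgment call
    return {"draft": draft, "escalate": escalate, "history": history}

result = handle_ticket({"customer_id": "C-001", "topic": "payment"})
print(result["draft"])     # Per your contract, the payment term is: Net 30.
print(result["escalate"])  # False
```

Each line of this sketch corresponds to one of the steps the copilot listed; the episode goes on to question how realistic each one is.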
This used to take multiple people and multiple hours of back and forth to get right. And it says that in five to ten minutes it can do this whole thing. So I read this to Maura. What did you say to that, Maura? Because I believed it. I believed what it said at first. I'm just telling you, I believed it. [00:02:49] Speaker B: I said, I don't think we can talk about that on today's episode, because I don't think it's true. That's what I said. Then we started breaking it down. There are pieces that AI can absolutely do. An AI that is attached to your customer relationship management system, if you have one, can absolutely pull a customer's history. A servicing system, like a ticketing system, is also going to have access to that customer's history, or the CRM itself is. So I was like, okay, it can do that. But I don't see that that's a big improvement. It's there already. If you have a CRM, you've already got that information at your fingertips. Second step, pulling the contract from the document repository. Here is where I have nightmares from a project that we did a little while ago, about customer numbers in different systems at an organization not being the same. Customer numbers in the finance system might be one thing. There might be multiple customer numbers for the same entity in that finance system. Who knows why? It just happens. I do know why, but it's too much to go into here. Second, you might have a different customer number in the CRM system, because they were a contact before they became a customer and they got a number there, but that number didn't meet the requirements of the financial system to take money from them.
And you probably have a contract number, and the contract might have another customer ID for the customer, for the entity that signed the contract, or for the contact inside that system. That's another system, with another number. So once you've gone outside the first system, the CRM system, now you have to match some data, and matching data across different systems is harder. It's much harder, and it requires better data, cleaner data. So in this example, Lee, do you think the data is all clean and there's a master data program going on? [00:05:12] Speaker A: In the example, I think there were a number of assumptions, yes. And one of them being perfect, pristine data. [00:05:20] Speaker B: Yeah. Which, in all of our years of consulting, I don't think we've seen a client that had that. But okay, we got that one. Then there's a security question for me, because contracts have a lot of proprietary data. They have a lot of financial data that's not usually open to everybody. So does your customer service rep have access to the contract? I don't know. [00:05:53] Speaker A: So I've got to interrupt, because I just asked if the AI could rely on dirty data. What if it's dirty data, and the multiple names and multiple customer numbers are all intertwined? It says it hallucinates the gaps. [00:06:11] Speaker B: Yes, but it's very self-aware. Self-aware is a dangerous phrase to use with regard to an AI, but at least it's advertising the fact that it hallucinates the gaps. It's going to fill those gaps in with something that makes sense to it. [00:06:31] Speaker A: Right. [00:06:33] Speaker B: But which may not be true. So that's important. Then, identifying the relevant clause, that's an interesting piece too. It depends on a well-worded problem statement, right? The ticket has to be well worded.
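The customer-number mismatch Maura describes can be made concrete with a toy example. The identifiers and names below are invented; the point is that a direct key join across systems finds nothing, and a name-based fallback returns multiple candidates rather than one answer.

```python
# Invented identifiers showing why a cross-system join fails: the same
# company has two finance numbers, a CRM number from its days as a
# contact, and a contract that carries no customer ID at all.

finance = {"F-10293": "Acme Corporation", "F-10294": "Acme Corp."}  # duplicates
crm = {"CRM-77": "Acme Corp"}
clm = {"K-2024-001": {"party": "ACME CORPORATION", "customer_id": None}}

# A naive join on the CRM key finds nothing in finance:
print("CRM-77" in finance)  # False

# So matching falls back on (messy) names, and gets two candidates:
crm_name = crm["CRM-77"].lower().rstrip(".")
matches = [fid for fid, name in finance.items()
           if name.lower().rstrip(".").startswith(crm_name)]
print(matches)  # ['F-10293', 'F-10294'] -- ambiguous, not a single answer
```

This is the gap an AI would be tempted to "fill in with something that makes sense to it," which is exactly the hallucination risk discussed above.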
So we've got a ticketing system, we've got CRM, we've got CLM, contract lifecycle management, we've got customer ID question marks, and then we've got this relevant clause question, which goes back to that original ticket. That's the piece that I actually think an AI can do in a not-bad way: read a ticket, understand conceptually what's being looked for, and identify the clause in a contract that addresses that question. I think that's where the AI large language models started, in things like that, building out of the fuzzy logic from eDiscovery tools, and building even further back from keywords in context or keywords out of context. That chain of capability has built over time. I think that's pretty solid. But that's step four, in the middle of this thing. Then we get to draft a response and decide whether to escalate. Those are judgments. When I brought that up, what did you say? [00:08:05] Speaker A: Well, I think it could draft a response based on your current policies, as long as those policies are all in one area as well and available. [00:08:17] Speaker B: So now we're up to four systems that we're looking at. [00:08:20] Speaker A: Right, but the escalation, that's odd to me. Somebody would definitely have to step in and say, if there's anything outside of these boundaries, escalate it to legal or, you know, the appropriate department or something like that. But again, somebody has to step in to say do this, because somebody [00:08:43] Speaker B: has to step in now too. But now they're working off of less information, but validated information. The original ticket that was written by the customer, or by the customer service operator talking to the customer, that's the most reliable piece of data we have going into this, because that is what the person said. They asked this question, and that's where the judgment happens now. So they're kind of close to the source.
This judgment step, coming at step six, happens after it's gone through everything: the AI looks at the ticket, looks at the CRM, looks at the contract, identifies a clause, drafts a response. So we're like five steps removed from the source. We've had other sources come in, but we're counting on the AI to get it right. So that worries me. I'm not saying it's not possible, but it worries me. Step seven, which I don't even know if you got to when you were listing the things, was document the case. I actually think it's probably good at that. It can capture all the steps it goes through. So what does this mean? Where have we gotten to? I'm not buying this scenario, based on the real world of messy data, separated systems, security, access gaps. But those are overcomeable. Is that a word, overcomeable? Maybe. Those are able to be overcome with work. That's where I'm landing: okay, this is something to strive for. [00:10:34] Speaker A: And I think you're absolutely right. When I asked it about the dirty data, can it see through this, it said AI cannot determine which one is the real customer unless there's a golden record. And we could talk about that: master data management, governance rules, entity resolution logic. So that's a whole new world. And metadata consistency. [00:10:58] Speaker B: Yes. The golden record, I think we did talk about that before in one of our episodes about master data management, because the golden record is where there are rules and audit trails and processes in place to make sure that the data in that record is correct, up to date, and reliable. Entity resolution logic, that is a nice phrase. [00:11:26] Speaker A: A mouthful of a lot of stuff.
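"Entity resolution logic" can be shown in miniature using only Python's standard library. The records, the 0.6 similarity threshold, and the survivorship rule (keep the longest value per field) are all invented for illustration; real master data programs, like the project described next, use many more attributes and far stricter rules.

```python
# A toy "entity resolution logic" pass using only the standard library.
# Records, the 0.6 similarity threshold, and the survivorship rule
# (keep the longest value per field) are invented for illustration.
from difflib import SequenceMatcher

records = [
    {"id": "F-10293", "name": "Acme Corporation", "addr": "12 Main St"},
    {"id": "F-10294", "name": "Acme Corp.",       "addr": "12 Main Street"},
    {"id": "CRM-77",  "name": "ACME CORP",        "addr": ""},
]

def similar(a: str, b: str) -> float:
    """Fuzzy string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Pass 1: cluster on fuzzy name similarity. A real project would run
# further passes on addresses, contract numbers, tax IDs, and so on.
clusters = []
for rec in records:
    for cluster in clusters:
        if similar(rec["name"], cluster[0]["name"]) > 0.6:
            cluster.append(rec)
            break
    else:
        clusters.append([rec])

# Survivorship: the golden record keeps the most complete value per field.
golden = {f: max((r[f] for r in clusters[0]), key=len) for f in ("name", "addr")}
print(len(clusters))   # 1 -- all three records resolve to one entity
print(golden)          # {'name': 'Acme Corporation', 'addr': '12 Main Street'}
```

Even this toy version makes the cost visible: the threshold, the match order, and the survivorship rule are all judgment calls someone has to make and audit ahead of time.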
[00:11:27] Speaker B: Well, it's a nice phrase, a quick three-word phrase, to describe what in the nightmare scenario took us easily, I don't know, 350 hours of analysis, using a variety of tools to match not just names, but also names against contract numbers, names against addresses, names against all the wrong IDs, names against all of the pieces of data that we knew about the entities, to distill out that golden record, the real set of viable entities in the large data set. And it was logical: we did it in iterative batches. We added in different pieces of data to help us do it. The fact that we can outline those steps means that at some point an AI may be able to carry those steps out faster. God, I hope so. Faster than the 300-plus hours that we took doing that. But it still has to be done, and it's not going to happen on the fly. It has to happen ahead of time, because the AI trying to answer this question can't go off and do a whole entity resolution project to find out if it's got the right entity, the right customer, to get to the right contracts. Then, there's probably more than one contract for any given customer in the repository. So once you've got the right customer, you have to go back to the ticket and say, well, which contract is going to cover this question? What if the question is kind of generic and it crosses all the contracts, but the contracts are conflicting? Which is very likely in real world situations, where contracts evolve over time and sometimes are assigned or sold from one group to another. The original group has a set of terms, then they buy a set of contracts from another group that had slightly different terms. Unless the organization has made an effort to harmonize and normalize all of the terms, which is essentially creating new contracts with the buying entity, they still have all the conflicts.
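The conflicting-terms situation just described is easy to sketch: if a customer's contracts disagree on a term, the safe automated behavior is to flag the conflict for a human rather than let the system pick one. The data below is hypothetical.

```python
# Hypothetical sketch of the "contract one says 30 days, contract two
# says 45" problem: when a customer's contracts disagree on a term,
# the safe automated behavior is to flag it for a human, not pick one.

contracts = [
    {"id": "K-001", "payment_days": 30},    # original book of business
    {"id": "K-002", "payment_days": 45},    # acquired, never harmonized
    {"id": "K-003", "payment_days": None},  # "we never have to pay"
]

terms = {c["payment_days"] for c in contracts}
conflict = len(terms) > 1
if conflict:
    print(f"CONFLICT across {len(contracts)} contracts: escalate to legal")
else:
    print(f"Consistent term: {terms.pop()} days")
```

A rule like this only tells you a conflict exists; resolving it is the harmonization work, and ultimately the legal judgment, that no automation removes.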
And that means that they get to a question and they may not be able to answer it easily, and then it for sure has to go to legal. But even legal is not going to know the answer. They're going to say, oh no, contract one says 30 days and contract two says 45 days and contract three says we never have to pay, or something. Who knows? So yeah, I think this is a boon. I agree that it is a growth accelerator and a governance accelerator, but it is not a let-it-go-and-it'll-solve-your-problems tool. [00:14:39] Speaker A: Absolutely. Integration, configuration, governance, orchestration, human oversight, all these things have to be set up [00:14:52] Speaker B: and thought through in advance. Actually, I really do like this scenario, because it pulls a lot of different parts of the company in and it brings up all the favorite problems. Customer history is a huge thing. If we had something in here that was about location, that would complete the whole problem set. If some of their contracts were in one state or another, then different clauses might apply. And does the AI know where the customer is located? Regardless of where the customer is located, it really might be about where the contract is located. Except if you're in California, the data privacy rules apply regardless of your location. So "it's complex" is, I guess, the best way to put it. But I think it's aspirational, something to work toward in small pieces: looking at your current data and getting it ready, and then applying AI to the small pieces before you try and bring them all together. [00:15:59] Speaker A: That makes sense. I think that's good for today. I think that was an excellent episode. [00:16:05] Speaker B: All right, well, thank you for that very thought provoking example. Although you forgot that we were supposed to both burst into laughter when we talked about the data all being clean.
Yeah, I guess we had used that up in our prep. [00:16:20] Speaker A: Well, we used it up in my introduction of the whole podcast. I mean, that's true. [00:16:24] Speaker B: Anyway, we kind of wore you out in the prep is what it was. That's what was coming out in your introduction. All right. Well, thank God. [00:16:34] Speaker A: Yes. Please send us an email at info@trailblazer.us.com, or look us up on the web at www.trailblazer.us.com, or look us up at the Learning Academy, which is trailblazerlearningacademy.com. Thank you for listening, and please tune in to our next episode. If you liked this episode, be a champion: share it with people in your social media network, or like or subscribe to our podcast. It's always helpful. As always, we appreciate you, the listeners. Special thanks goes to Jason Blake, who created our music. [00:17:06] Speaker B: Thanks everyone. Look forward to learning more with you the next time.
