Andy Cooke on the Future of In-house in the Age of AI
Andrew Cooke (GC @ TravelPerk) and Charlotte Kufus (Co-founder @ Legal OS)

Andrew Cooke (Andy) is the General Counsel at TravelPerk, the world’s leading B2B travel management platform. Andy delves into the pivotal trends driving the success of AI for internal teams, recounts TravelPerk's experience in rolling out AI in early 2023, and shares some predictions for 2024. This text has been adapted from a webinar transcript.

First, it's interesting to explore the trends that have emerged over the last decade or so, and that explain why we see, certainly at TravelPerk, such tremendously enthusiastic uptake of and engagement with Legal OS and AI products generally.

2023 was a year of being humbled by the power of technology. (...) I was clapping my hands like a fat baby during this entire period. It was amazing.

The service we provide as lawyers is subject to some of the same trends that apply in our lives as consumers. Very generally speaking, the SaaS revolution introduced a world of zero friction to the purchase of services. The example I always provide is Uber. Before Uber, you held your hand out on the street or called a taxi dispatcher. In the latter case, you'd have no idea when that taxi was going to turn up. Generally, you'd have no idea how competent the driver was, or how old or new the vehicle would be. Now, Uber gives you all that information in a way that is extremely convenient to you, and people experience versions of Uber across many different types of services: everything from dating, to finance, to last-mile food delivery. All those experiences influence how consumers engage with products.

Now, think about your regular legal tech product. Your LinkedIn inbox is full of people trying to sell you landfill legal tech stuff. There will be a very high implementation cost, and a lot of the “benefit” will be focused on you, the legal team. But the legal team doesn’t exist to serve its own needs. You exist to help your colleagues achieve their goals and help your employer win. Not enough lawyers focus on the needs of customers.

...the legal team doesn’t exist to serve its own needs. You exist to help your colleagues achieve their goals and help your employer win.

So what are the needs of customers? They are increasingly for on-demand solutions that don’t create a heavy degree of friction. There’s plenty of data to support that as an overall trend in consumer behaviour. So, we want to strip friction out of the customer experience, and ideally we want to create a curated customer experience that gets them what they need, when they need it. 

In a pre-AI world, the experience might have been a decision tree-based chatbot, serving information we know customers need. But using decision trees requires us to define and curate both the decision tree and its endpoints. Or it might be a contract lifecycle management system (CLM), where we flip around the workflow and allow people to self-serve simple documents. But again, there are lots of stages in putting those documents together, and it requires people to interact with systems that don’t belong to them. The work isn’t being done out of Slack or Salesforce. It’s being done out of something like Ironclad or Juro, a system which belongs to legal and is therefore an alien environment to the customer, an environment involving friction.

In the wake of the ChatGPT revolution, around January last year, we started to consider the ways in which we could introduce AI to meet the needs of our customers for on-demand solutions. At that time, the product experience that Legal OS has since been able to offer us wasn’t available. It wasn’t something we could get off the shelf. We started to see versions of it, but with very low quality control in terms of delivery outcome. And of course the goal is not to have lawyers in the loop of the solution, but to get lawyers out of the loop, because only then can it be scaled to the on-demand experience we’re trying to deliver.

...by April we were already full enterprise and serving hundreds of queries to people in a deeply satisfactory way. 

What happened next was humbling. 2023 was a year of being humbled by the power of technology. Anybody who’s done a lot of big-ticket implementations will be able to tell you that it’s awful. You go through this war zone of planning and implementation. So you go in with the expectation that, when you bring in a technology, there’s going to be a lot of legwork in getting it to a place where it becomes useful. That’s the mindset. We expected to go through an extended implementation period with a small subgroup of users before we could go full enterprise. In February or March, we were expecting to have a very basic solution for simple tickets scaled by about September, and there would probably still be a lawyer in the loop. As it turned out, by April we were already full enterprise and serving hundreds of queries to people in a deeply satisfactory way.

I was clapping my hands like a fat baby during this entire period. It was amazing. We got to a place where we were serving on-demand, but in a way that is also really intuitive and requires nothing of the user. I’m just asking questions as I did before, in the place where I like to work: in Slack, or Chrome, or whatever. No training required. I don’t have to go out and teach people how to use this tool, because a child could use it. The work is taking place in the background, where we continually improve and iterate on the experience. And it’s been a long and spectacular ramp-up from there. We’re currently averaging about 3,000 queries a month, so one every five minutes, more or less. Our peak is 5,000 queries a month.

To give you some context, TravelPerk is a company of about 1,500 people. In 2022, we hired three people a day, and we’re about to hire several hundred more. It’s very reassuring to me, as General Counsel, and to my team, that we won’t have to significantly change this tooling to allow it to scale for those people. In fact, it can actually improve the onboarding experience for those people, because it makes access to information easy. Our total addressable market as a legal team massively increases. Before, we might have had a community of 50 super users who use our tools the most and come to us with the most complex set of queries. That’s now one of our user personas. There is also a substantial group of users we’ve never really interacted with, but who derive tremendous value from just being able to access information through tools. You can have the most perfect policies in the history of policy, but if people can’t access and query them in a way that works for them, in the timezone and language that suits them, then what’s the point?

The democratisation of this kind of information doesn’t only return value by allowing us to work at the top of our licence; it also creates value for people in other departments, like security and privacy, who used to have to handle those tickets. It takes work off their desk. It also maximises the impact of all the time you’ve invested in creating these beautiful policies, because it allows people to actually read and engage with them. Nobody’s going to sit down and read your conflict of interest policy, then somehow locate the author and ask them a bunch of questions about it. A 22-year-old sales representative based in Barcelona isn’t that careful. So we’re adding value all the way through the chain. This is way bigger than just the legal team doing cool shit with AI. This is a value release and a value driver across the organisation.

From August 2022 to today, we’ve reduced time to close by 82% overall. Information quality is a massive driver of that. 

Our sales teams love on-demand. It’s magnificent for them, because we brought down time to close. Time to close is the period between when a contract is initiated and when it’s signed, a metric for the overall negotiation period. We brought down time to close quite substantially with intelligent use of CLM and contract design, but what we hadn’t been able to deal with, before the launch of Legal OS, is that the process of negotiating a contract isn't just about that core contracting workflow. It's about all the information that sits around it, all the questions the customer will ask before getting into the contract stage or in parallel with it. Take things like ISO certifications, or even simple product queries: if you don’t answer these questions quickly, the contract might get stuck with a member of the customer's team who has a particular bee in their bonnet about how we handle customer data, which is quite reasonable, of course. We should have good answers to those questions. We shouldn’t just drive the velocity of the central contracting part; we should also drive velocity and create a great experience for our internal users, to help their prospects and customers get to a yes. From August 2022 to today, we’ve reduced time to close by 82% overall. Information quality is a massive driver of that.

The role of these technologies in changing the branding and perception of the legal team is also quite important. Our DNA is change, demystification, happiness. Everything goes through the lens of that DNA. Demystification is making services easier to access, making interacting with the legal team a pleasure rather than a burden. Those kinds of things are tremendously valuable in improving your presence in a company. It allows us to flip around some of the lazy stereotypes that still persist about in-house legal teams. We’re the AI guys now. Once you start delivering great customer experience, those stereotypes just fall over. 

So, thinking about the underlying trends of 2023, what do they tell us about 2024? I think we can expect the customer expectation that friction will continually be removed from processes to persist. The very wonderful Tom Rice shared with me something from the Consumer Electronics Show called the Rabbit R1, which is a little box that uses something called a Large Action Model to activate the apps on your phone. Essentially you can tell it to order you an Uber to come in ten minutes, book a restaurant for half an hour later, and send a text to your mother saying you’ll be late for dinner. And it will just do it for you. It’s $199, and they can’t make them fast enough to ship them.

So now we’ve removed the friction of people literally going into an app on their phone. Of course, eventually Apple’s Siri or the Google equivalent will have this sort of stuff integrated. We’re moving towards a space where even entering an app is too much friction. I think we can extrapolate that there will be ever lower levels of tolerance for users to participate in tools that are not where they want to work, and to do tasks that could easily have friction removed from them, where there hasn’t been an effort to remove that friction. Sending out an ms word document to someone and asking them to “please advise” just isn’t going to cut it for people anymore. You’re going to have to curate an experience for them to interact with that kind of document. We can say that’s people being lazy, or we can roll our eyes about it, but that’s the reality of how people will expect to interact with legal functions. 

Accordingly, we have to continue to build to remove friction. And we have to think, as the technology of Large Action Models or actions on-demand starts to commodify and consumerise, about what’s going to be expected of the existing tools. We already see this with people who started using Legal OS as a search engine alternative and are now using it as a copilot for all their legal needs. We’re going to have to respond to that. What can we do with the enormous amount of data that 5,000 queries a month can generate for us, and how can we use that to better tailor our tools? How do we integrate the other tools we may require, the ones that fill in the gaps as AI gradually builds its own competency set? And again, in the first few months after launch, people will be very surprised and excited by that stuff, but then they’re just going to take that convenience, build it into their workflow, and expect more of the products. So we have to keep on pushing forward.

Find out more about Legal OS at legalos.io, or book a demo.

For other enterprise use cases, discover inhouse.chat.