Peter Aprile and Natalie Worsfold interview Dan Katz, associate professor at Chicago-Kent College of Law, director of the Law Lab, and an affiliated member of CodeX at Stanford.
The group delves deep into Dan's passion: Fin (Legal) Tech (i.e., the financialization of law). Fin (Legal) Tech takes a data-driven, rigorous, and quantifiable approach to risk assessment and predictive analysis. And why should lawyers care? Well, it's simple: if lawyers can better predict outcomes, they can put skin in the game and reap significant financial rewards.
Professor Katz is a scientist, technologist, and law professor who applies an innovative polytechnic approach to teaching law, helping to create lawyers for today's challenging legal job market. Both his scholarship and teaching integrate science, technology, engineering, and mathematics.
Professor Katz's forward-thinking ideas helped to earn him acknowledgement among the Fastcase 50, an award which “recognizes 50 of the smartest, most courageous innovators, techies, visionaries, and leaders in the law.” He was also named to the American Bar Association Journal's “Legal Rebels,” a prestigious group of change leaders in the legal profession.
Peter Aprile is a senior lawyer specializing in tax dispute resolution and litigation. His vision as Counter’s founder and his everyday role at the firm are one and the same: to be an agent of change, uncovering opportunities and developing strategies that achieve more than anyone expected. A creative thinker, Peter studies problems from all different angles to find what others have missed. He’s also convinced that he likes winning more than most people.
Different people describe Peter in different ways. At the CRA and the federal Department of Justice, the word relentless comes up quite a lot. Admittedly, so does the word a**hole – but it’s often said with a certain grudging respect, if not affection. Peter’s clients call him a saint. Well, some of them, anyway. His colleagues describe him as empowering and hard-driving, but fair. Peter’s friends call him loyal. His wife describes him as a lot to deal with, but worth it. Peter encourages his young daughter and son to call him “The Big Homie,” though with limited success. His mother describes him with the single word mischievous – before going on to complain that he should call more.
Natalie is a tax lawyer who represents individual taxpayers and owner-managed businesses in disputes with the Canada Revenue Agency (CRA). She also successfully challenges CRA decisions denying taxpayer relief and helps facilitate applications under the Voluntary Disclosures Program.
But what you really need to know about Natalie is that she’s a tax litigator with heart. When she takes a case, it’s not out of technical interest – it’s because she cares. And if she believes the government has got something wrong, she won’t stop until it’s been put right. She’s fierce.
Natalie is the co-architect behind many of Counter’s process workflows, software and data analytics systems, as well as our comprehensive knowledgebase (lovingly named Hank). And when it comes to preparing cases, she’s Counter’s secret weapon – happiest when elbow-deep in evidence, meticulously building creative solutions to seemingly impossible problems. Because the fact is Natalie sees things that other people don’t.
Natalie’s family and friends describe her as loyal, selfless, understanding and fun. They also mention stubborn. To her Counter colleagues she’s a combination of stellar brainpower and contagious enthusiasm who elevates the game of everyone around her.
People
- Christopher Columbus Langdell
- Malcolm Gladwell, Blink: The Power of Thinking Without Thinking
- Paul Lippe
- Richard Posner
- Yoni Moussadji
Tech, Tools & More
- CodeX, The Stanford Center for Legal Informatics
- CounterMeasure
- ELM Solutions, Wolters Kluwer
- Fin (Legal) Tech Conference
- The Good Judgment Project
- Innovation in the Legal Services Industry - The Future is Already Here, It Is Just *Not* Evenly Distributed
- The Law Lab
- LexPredict
- LexSemble
- Loss-of-Value Coverage
- Premonition
- Reinvent Law Laboratory at Michigan State University
- Sky Analytics
- Traction Strategy
[music]
Peter Aprile: [00:08] Hi, and welcome to "Building NewLaw," Canada's first and only CPD accredited podcast. It's hosted by me, Peter Aprile, and my colleague, Natalie Worsfold.
Natalie Worsfold: [00:17] In each episode, we interview lawyers, legal technologists, and other like‑minded people at the forefront of new law.
Peter: [00:24] We hope that the podcast connects the new law community and helps us all learn more about the approaches that are changing the way that we practice law.
Natalie: [00:33] To learn how you can use this podcast to satisfy your Law Society CPD requirements, visit our website at CounterTax.ca/bnlcpd. That's CounterTax.ca/bnlcpd.
Peter: [00:40] Enjoy the show.
[music]
Sponsor: [00:41] The Building NewLaw podcast is supported by Counter Tax Lawyers, a new type of tax, controversy, and litigation law firm. To learn more about Counter, go to CounterTax.ca.
[music]
Peter: [01:09] Today we're speaking with Dan Katz. Dan is an associate professor at Chicago‑Kent College of Law. He's a director of Law Lab and he's an affiliate faculty member of CodeX. He's also the co‑founder and chief strategy officer of LexPredict, which has a great product called LexSemble which I suggest that everybody check out.
Natalie: [01:23] Dan is a co‑founder of Reinvent Law Laboratory at Michigan State University. His teaching has really changed the way that students are learning law.
Peter: [01:31] We first became Dan Katz fans after watching his presentation, which you can find on YouTube. It's called "Innovation in the Legal Services Industry." In that presentation, Dan argued that in order for law firms to compete they need to evolve, and that we need to rethink the partnership model and its inherently short-term underinvestment mentality.
[01:50] Instead, we need to build sophisticated process- and data-driven entities. Sound like any law firm you know?
Natalie: [01:56] [laughs] Dan's new passion is the financialization of law. Borrowing from developments in the financial sector, Dan is arguing that it's time to rethink the economics of law and to create measures of the value of legal services.
Peter: [02:10] Dan recently spoke at the Emerging Legal Tech Forum in Toronto. I went to that conference specifically to see Dan. I didn't tell him that, though.
[02:18] Dan, again, gave a great presentation. I tracked him down at the conference, had a great conversation with him about data, law, risk, and asked him to be on the podcast.
Natalie: [02:27] Let's get straight to it. Here is our interview with Dan.
[music]
Peter: [02:36] Dan, thanks for being here. You've spoken a lot about the legal industry and where it's been, where it is, and where it's going. One of the things that you've spoken about is the lawyer's value proposition. Can you tell me what you see the lawyer's value proposition as?
Dan Katz: [02:52] Most lawyers do one of two things: they reverse engineer complexity for people (something that looks very complex, they know how to do), and/or they help manage risk: enterprise, legal, and other types of risk. Those are the two things I would say a large percentage of lawyers are engaged in.
Peter: [03:10] When you say the value of lawyers is managing complexity as well as risk, are lawyers doing a good job at that?
Dan: [03:20] The complexity of legal processes, some of that is set exogenously by lawmakers or regulators. And so, I think that we've seen a massive growth in complexity in the world. That has actually put a lot of pressure on existing legal processes' ability to scale at the same rate at which complexity is increasing.
[03:41] In some sense, that's the entry point for technology, which has come in for certain types of problems whose size and complexity our existing legal processes will not be able to handle. If you look at, say, ediscovery as an example, the dominant form of business communication shifted towards email.
[03:59] When we get into these cases, these reasonable sized pieces of litigation, there just wasn't an ability, given how much email there was, to use the old methods.
[04:08] I suppose if somebody's going to pay you to review 10 million emails, then go ahead and run the meter, but the clients, quite appropriately, have balked at that. That is where technology has come into the legal industry: ediscovery tools, contract review tools, things of this nature, which are actually designed for the work that lawyers do.
[04:26] That's what makes it unique, whereas telephones and Dictaphones and email are things that were available in the world and lawyers just happened to use.
Peter: [04:34] If we're talking about the advancement of legal tech and the products made for lawyers specifically, what are the clients getting out of this? They're getting increased efficiency from their lawyers. They're getting...
Dan: [04:45] Quality, too. Don't forget, everybody harps or locks in on efficiency. We also can get better quality. Part of what has not been done is we haven't really exposed the error rates that the existing processes people follow actually generate.
[04:58] When you actually put some of those things on trial, you realize, "Wow, we were creating a lot of risks with some of the practices that we were actually engaging in historically." But nobody was keeping score. If you don't keep score, you don't realize that your house isn't nearly as in order as you might think it is.
Peter: [05:14] Yeah. Can you talk a little bit about that in terms of...? You've talked before about transparency issues and the ability to see that quality or see how, I guess, the sausage is being made.
[05:25] When I think about what you're doing or what you've been advocating is something that we've seen, as well, and it's that idea that there needs to be a greater transparency between the parties.
[05:36] It's important that clients have a better understanding of what lawyers are actually doing and how they're doing it. I think you see, we see the same thing, as well, is technology giving lawyers the ability to show their clients, and giving clients the ability to see it in a way that hasn't really happened before.
Dan: [05:54] It's true that there are people who have made a living off of opacity, and the world, more broadly, is becoming about transparency. I would say that lawyers historically hate transparency. They hate it not just for their own business, but their basic intuition is to lock information down and to control the flow of information, but the world is saying that's not how it's going to be.
[06:15] A lot of lawyers have struggled with that in a more general sense of practicing law. Now, if you talk about the business of law, people have made money off of opacity, and I would say if your business model is linked to making money off opacity, you need a new business model.
[06:30] You've got to get back to what is your value proposition, and any time you get too far away from that, you're basically having unstable business. I'm not one of these people who thinks that all the lawyers are going to go away, or something like this, but I think people have to really think about getting closer to where your real value proposition is.
[06:47] If you say, "What I can do is reverse engineer complexity for you," and there's a lot of complexity in the world, and I can do that through a mixture of my own expertise and a set of tools, technology tools or other types of tools, that's a fairly sound business.
[07:05] To the extent your business is about a bunch of other stuff, I have serious questions about its long‑term viability.
Peter: [07:12] That's something that we try to do, giving our clients full transparency, and my question to you is do you see a lot of lawyers moving in this direction?
Dan: [07:20] No, I don't see a lot of them. I see some of them doing that, and I think there's a business opportunity to be successful in a world that is demanding more transparency across all types of forms of human endeavor.
[07:33] To be in the transparency business is a good business to be in. To be in the opacity business in a world that is becoming more about transparency, it feels very anachronistic and not long for this world. That's my view.
Peter: [07:45] Is there anybody in particular that you see that's been moving in this direction that you can identify?
Dan: [07:50] Some of the data providers themselves, like Sky Analytics and TyMetrix, which is now part of Wolters Kluwer's ELM. They basically are data providers that aggregate billing data and give it to enterprise clients to try to create, essentially, transparency where there hasn't been transparency.
[08:09] Transparency as a service, if you will. That's the new SaaS model, I suppose, out there.
Peter: [08:14] I think that's a great first step, but when I look at this, and when I think about this, and when I look at what these third-party providers are offering the market, it's amazing. It's a wonderful first step.
Dan: [08:24] I want to jump in, if I can. I would expect third parties to be the initial providers, because if you have a business based on opacity, you have to cannibalize that business, essentially, to move to transparency. It's much easier if you're a third party to say, "No, no, we'll just sell transparency, and that's going to be the business we'll be in."
Natalie: [08:42] Why aren't people more transparent? What's the barrier there?
Dan: [08:46] Money.
Peter: [08:46] Money.
Dan: [08:47] Because people make money off of opacity right now, and if you make money off opacity, you do not want to move to a world of transparency because it's threatening. [laughs]
Peter: [08:57] The other thing I think about is this: if none of the major players in this market move in this direction, then the rest of them don't have to either. [laughs] The structures support each other in promoting this opacity.
Dan: [09:11] I would really expect to see over the coming decade a real break of ranks because there have been successful...You asked me for some examples. I know a few examples, but they're things I happen to know on a proprietary basis, so I don't really want to discuss them directly.
[09:26] But I will say that a number of firms who are very traditional have made some pretty good money on success fees, which means they're doing litigation finance by implication. A place you would never expect to do that.
[09:39] It starts to reveal things, because when you have a contingency fee, you have skin in the game, and you care a lot more about outcomes than you ever did before, and I like that. This is part of my broader argument about law is finance and law should look more like finance.
[09:53] When people have skin in the game they tend to do better, they tend to care more. It's hard to care as much when I make money, whether I have success or failure, than if, "No, I only make money if I have success." That sharpens your sword a bit, or gets you much more focused on achieving good outcomes.
[10:10] I wish that everything had a mild contingency aspect to it. I think that people without a direct economic incentive do not develop the types of innovations, the types of improvements in the way they do their work, that they do when they actually have money on the line.
Peter: [10:27] I don't think that's a slight against lawyers at all. I think that's human nature.
Dan: [10:30] All I'm saying is, "Gee, lawyers are people, too."
Peter: [10:33] [laughs] Nobody will believe that.
Natalie: [10:38] When I look at lawyers hiding, and not being transparent, there's two things that jump to mind. The first is, obviously lawyers will say, "It's so complicated I couldn't possibly explain it to you." The other one tends to be a professional liability argument.
[10:52] What reasons have people been putting forward to you as to why they're not embracing transparency, other than money?
Dan: [10:57] I don't think they really even think about it as a lack of transparency. They're going to state their stock deck of reasons that they can pull out and play, like, "Here I have the seven of spades. Here I have the jack of diamonds."
[11:08] They're just going to play the cards they're going to play, which are these narratives which, if you put a little bit of pressure on, don't really hold water. People will say, "I have a special professional responsibility in this area." "Oh, OK."
[11:22] To what extent is that actually implicated in virtually any instance? No, not really. It's not really implicated. It's just something that sits out there on the horizon, but it's not something that really would be implicated in most instances. Only in theory is it implicated, but it becomes this justification to essentially run a bad business.
Peter: [11:41] I'll take that a step further. This whole discussion ties into what you've talked about in the past, which is this transition from law being an art to a science. There's a part of me that thinks this lack of transparency in part has to do with the fact that some of the analysis that's being put forward isn't that sophisticated or that deep.
[11:59] Because of that, I think that there is a protection of what the analysis is, the rigor of the analysis, or what makes up portions of the analysis. Can you speak a little bit about that, and maybe about the transition from art to science?
Dan: [12:11] I feel law is basically a qualitative and humanistic exercise as it's taught, and as it's practiced typically. That is not a very sound thing, in my opinion, because there are many instances where one can do a better job on the task at hand by applying basic tools of science, technology, engineering, and mathematics to the question, and blending that with the humanistic element.
[12:36] This purely humanistic, purely qualitative and rhetorical exercise that people engage in is just not sound.
Peter: [12:44] That's amazing to me when you start to realize that about lawyers, this more qualitative analysis. People outside of the law assume that it is more of a quantitative analysis. The question is, how do you think we got here? How do you think we've gotten away with this for so long?
Dan: [13:00] That's a very multi‑dimensional issue. Part of it is the educational process, the Langdellian model. Langdell was the dean of Harvard Law School around the turn of the 19th into the 20th century. He developed the case method.
[13:17] You read the cases, and you essentially analyze, mostly, the reasoning of appellate judges. He actually felt like law could be a science. He wanted law to have that status. The problem was, the exercise people were engaging in was basically rhetoric.
[13:31] It just settled into that, and it became the battle of the arguments; that was the way that the law developed. It wasn't particularly scientific. In fact, a lot of people said you can't be scientific. They challenged what people were doing, saying, "You're acting like this is a science, but it's not a science."
[13:46] They were right about that, but what they missed was, it could be. It could be a science. It could be treated like a science. Richard Posner comes in in the 1970s and brings law and economics into the legal academy.
[14:01] He says economic analysis offers some insight into the law. Mostly, what that became is economic analysis as a tool for evaluating public policy. Less so, and the thing that I'm very interested in right now, is saying that what lawyers are really doing often is providing some class of finance or insurance to their clients, and trying to help them manage enterprise legal risk.
[14:23] That thread was not picked up to the level that it should be, because lawyers wanted to be policymakers, and law professors wanted to train people to be appellate advocates. That's what they thought was the highest and best calling for people.
Peter: [14:36] This is your idea, and I didn't pick up whether you said, "Hasn't been picked up," or, "Isn't being picked up."
Dan: [14:42] It wasn't picked up, in the sense that I haven't seen a concentrated interest in this idea that what lawyers do is essentially finance. What lawyers do is essentially insurance, risk management, things like this. You'll see an occasional paper here, an occasional paper there, an occasional conversation here and there, but not a concentrated effort.
[15:03] I really don't understand it, because if you just look at many of these processes, you would just say to yourself, "That looks a heck of a lot like insurance to me," but it's not real insurance, because there's not rigorous underwriting going on.
Peter: [15:17] Can you give an example of that, just to tease that idea out a little bit? I think that this is going to be a foreign concept for people who are listening. Even when I heard you talk about it in the first instance, I thought it was true.
[15:27] Second of all, I thought that's a really interesting way to look at what we're doing as lawyers. Could you maybe provide some context for that, or maybe an example of that?
Dan: [15:35] For the lawyers that love taxonomy, let's start with a taxonomy. My friend, Paul Lippe, who writes for the "ABA Journal," "New Normal" blog, and was a Silicon Valley general counsel, among other things, says there's three types of lawyers.
[15:48] A mediocre lawyer goes around and plays whack‑a‑mole. What he means by that is, they go around and say, "This is risky. This is risky. Oh, we can't do that. This is very risky." They basically are just parroting back their law school training of being an issue spotter, and flagging risks for people.
[16:08] A merely clever lawyer distorts external perceptions of how risky something is. Maybe on a deal, a transaction, they're able to make a company look better than it is, or maybe a contract term look better than it actually is.
[16:24] A great lawyer helps their client price risk. Price risk, what do I mean by that? Business is inherently about risk. Telling your client that something is risky isn't a particularly instructive exercise. If you can't actually quantify or characterize the nature of the risk, on what basis are you making a decision about whether to go forward?
[16:46] Of course, if they're engaging in straightforward illegal conduct, then you should just tell them not to do it, but most things aren't that. Most things are, "Gee, we might be exposing ourselves here. Gee, this might not be a prudent decision."
[17:00] It needs to be seen against the broader context of the benefits and costs of a particular exercise. We're typically not able to put a numerical value on these risks. I see this showing up in lots of things. People say, "We can't do that. There's reputational risk to the organization."
[17:13] How much? What sort of reputational risk? What are the potential benefits? It usually just falls down to: a lawyer says it's risky, and then that's all they can bring to the conversation.
[17:23] That's why the, let's call them non‑lawyers, also known as the CEO and CFO, say, "I'm not going to talk to them until I absolutely have to, because all they do is throw a wet blanket on everything, and prevent us from doing basic things."
[17:35] Again, I want to make sure that I'm clear. If the people want to engage in illegal conduct, that's your role to stop them, but a lot of stuff is far from that. It's just something that maybe you don't think is prudent.
[17:45] We want to be great lawyers. We want to help people price risk. Now, you say, "How do we price risk?" Back to the mediocre lawyer, who will say, "You can't quantify this." If you can't quantify it, then how do you ever make a decision about anything?
[17:59] My argument is you already are quantifying it implicitly. The problem is that you're not using the tools that have been developed over many hundreds of years to actually manage risk.
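The implicit quantification Dan describes can be made explicit with basic expected-value arithmetic, one of the oldest tools of risk management. A minimal sketch, where the dispute, its probability, and the dollar figure are purely illustrative assumptions:

```python
def expected_loss(probability: float, magnitude: float) -> float:
    """Expected value of a risk: chance it materializes times its cost."""
    return probability * magnitude

# Hypothetical dispute (assumed figures): a one-in-four chance of losing,
# with $500,000 at stake if the loss occurs.
exposure = expected_loss(0.25, 500_000)  # 125000.0
```

Even this crude number, put beside the expected benefit of the decision, gives a client more to work with than "this is risky."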
Natalie: [18:11] If people aren't quantifying risk right now, is this the guesstimate and the gut analysis that's going on?
Dan: [18:18] Yes, typically it looks like intuition, hunches, experience, but what I think is really damning is law's full of the cult of single‑person expertise. You would not run an insurance company on the basis of, "We're just going to ask one single expert, and then underwrite policies based on the decision making of a single expert." That's a very unsound practice.
[18:42] Again, we have a lot more history in managing risk in the insurance space. You ask yourself, "Well, what caused insurance companies to succeed or fail in characterizing risk?" They got a long way away, typically, from cult of single expertise.
[18:54] They got into modern tools of finance. That doesn't mean they haven't had some spectacular failures along the way. My argument is law actually has as many or more spectacular failures, but we don't keep score. Because of opacity, people are often able to hide the fact that they've done a very poor job on things.
Natalie: [19:11] I guess there's blame, too. If you're going to one person, and they get it wrong, they're a bad lawyer, then, at that point, right?
Dan: [19:17] That's right, but it's actually systemic. One of the things that I've argued for is if you look at what would be called manual underwriting, manual underwriting in insurance has these properties to it. Define a set of information that you need in order to characterize the risk of something.
[19:35] If you've ever gotten a mortgage on a house, the mortgage broker's going to collect, say, 19 pieces of information about you and about the deal. That becomes the set of information on which somebody scores the risk of this particular deal, and some mixture of an algorithm and a person is going to decide what the risk level is and what's an appropriate level of interest rate to charge on a particular deal.
[19:58] Let's look at something else. Let's think of something really exotic. There are college football players in the United States that are able to get a loss of value or a loss of draft status policy.
[20:11] I'm projected to be the 10th pick in the draft, but if I get injured, I can collect on a policy if I am now drafted 100th instead of 10th. How does one quantify that?
[20:23] They go out, and they try to solicit a bunch of different people's opinions as to where somebody would fall in the draft. They try to go out and collect data on what the difference between being drafted 10th and 100th is, and then they say, "Well, at what rate?"
[20:36] What is the percentage chance a person drops from 10th to a 100th, based on both historical data and a panel of experts consulted independently?" That's how one sets a value on what those premiums ought to look like.
[20:49] We can do this stuff in law across a whole range of big choices, but we have to get away from the cult of single‑person experts.
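The loss-of-value underwriting Dan sketches reduces to a drop probability times the payout, plus a loading for the insurer's expenses and profit. A hedged sketch; the 5% probability, the contract values, and the 25% loading are invented for illustration, not real draft data:

```python
def fair_premium(p_drop: float, value_high: float, value_low: float,
                 loading: float = 0.25) -> float:
    """Expected payout on a loss-of-draft-status policy, plus a loading."""
    expected_payout = p_drop * (value_high - value_low)
    return expected_payout * (1 + loading)

# Assumed inputs: 5% chance of sliding from the 10th pick to the 100th,
# with contract values of $12M at 10th vs $2M at 100th (illustrative only).
premium = fair_premium(0.05, 12_000_000, 2_000_000)
```

The same shape of calculation applies to the legal "big choices" Dan mentions: estimate the probability of the bad outcome from data and an independent panel, multiply by its cost, and price accordingly.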
Peter: [20:56] There has to be a willingness to do it, and there has to be a willingness, even if we can't get out of that single‑person expert as quickly as we all would like.
[21:04] There needs to be a willingness of that person that's giving the probability, or the advice, or pricing risk, as you say it, to be held accountable.
[21:11] Again, that bleeds back to the transparency argument, but when I hear you talk, and when I think about what we do here, and this idea of how you price risk, my answer starts and stops at: you price it by trying to price it. It might not be perfect along the way, but it's the first step, and the most important step, and all we need to do as lawyers is try.
Dan: [21:32] Yes, one of my arguments is you don't need technology to do better underwriting. You need a mechanism to solicit independent feedback on decision‑making, and the key is to make sure that you don't create a context where people are refracting back your view.
[21:48] Some people say, "No, no, no, it's not just my decision. We had a meeting. We talked about it, and this is the group decision," but if you really decompose the meeting, what it was is the senior person asserted their position, then it was up to everybody else in the room to decide whether they were going to challenge, essentially, their boss as to the validity of their view.
[22:07] Then we acted like that was some sort of group decision. It was not. It was far from that. So one of the things people should be doing is collecting independent judgments before having a meeting about where people's positions are.
[22:20] The idea is human capital in this field is not being appropriately leveraged. We're not actually leveraging the expertise we have. We have a lot of smart and creative people, but we basically say, "No, we're going to have the cult of a single expert drive decisions with massive financial consequences," and, again, that is a deeply unsound practice.
[22:39] [background music]
Natalie: [22:39] We hope you are enjoying the podcast. We want to take one minute to tell you about a new segment that we're adding to the end of each episode. It's called "I am Building NewLaw," and it gives you the opportunity to share what and how you are changing the practice of law.
Peter: [23:02] If you're building a new law firm, legal tech company, or product, we want you to record a message and tell us all about it, and we'll add it to the end of one of our podcast episodes.
Natalie: [23:17] Go to countertax.ca/iambnl. Click on the voicemail button, and tell us what you're building and why.
Peter: [23:24] We'd love to hear about your experience so that we can share it with the whole NewLaw community.
Natalie: [23:31] Remember, stick around until the end of the show to hear how one of our listeners is changing the practice of law.
Peter: [23:22] And now back to the show.
[23:25] [background music]
Pippi Scott‑Meuser: [23:25] Pippi here, producer at Building NewLaw. For all you BNL super fans, you'll remember me from Peter Carayiannis' episode. I've got one more exciting announcement to make before we can get back to the show.
[23:38] As you know, if you're a Canadian lawyer, you can use this podcast to satisfy your Law Society CPD requirements by simply listening to the episode and completing a short quiz.
[23:48] As you probably also know from listening to Peter and Natalie, we're constantly looking for ways to do things better, faster, and with greater efficiency, so in an effort to make completing your CPD as seamless and easy as possible, simply text 0204 to the number 647‑800‑2265, and we'll reply with a link to your CPD quiz. It's that easy.
[24:13] Again, text 0204 to 647‑800‑2265. Now we can get back to the show.
[24:23] [background music]
Natalie: [24:23] You mentioned that you don't really need a lot of tech in order to do this. What's the best way to get a group of independent people to take a look at a problem and then put a number on it?
Dan: [24:36] My company, LexPredict, has built a platform. It allows people to do independent crowdsourcing from within an organization. Think of it this way, you create a set of information that forms the basis for deciding what the risk or exposure on an event is, whether a regulatory outcome's going to happen in a particular sort of way, or any other question, a transactional question, or what have you.
[25:01] The key is to have that information set given to a person, and have them render a judgment, and you can specify what judgment you're looking for. It could be an outcome. It could be asking what theories, or arguments, or what support do you have.
[25:15] The idea is you'd solicit a bunch of opinions, and then figure out how to aggregate that back together. The idea is that you can harness expertise in a more frictionless manner.
[25:26] This is the benefit of technology in this space, is that it's hard to get a 2nd, 3rd, 4th, and 5th, and 10th opinion because there's a lot of friction. You got to get a meeting with somebody. You got to sit down with them. You've got to go through it.
[25:37] It takes time, and our view is the less friction you can put on that, getting a 2nd, 4th, and 10th opinion, the more likely you are to do it.
[25:46] The second piece of it is Malcolm Gladwell actually popularized these ideas in his book, "Blink." In social psychology there are a lot of studies about experts, how they formulate their judgments, and how long it takes them to formulate a judgment.
[26:01] Often a first‑pass view of an expert is as good as them spending a lot more time on it, so think of it this way, "How about you look at that for 10 hours, and come back to me with your view of what's going to happen here?"
[26:14] That's 10 effort hours. Maybe what I should do is have five people spend 2 hours, and take those same 10 hours and re‑characterize them so that I can get the views of five people instead of one person.
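Dan's reallocation of effort hours lends itself to a quick illustration. The sketch below is hypothetical: the numbers and the averaging rules are invented for this example, and nothing here is LexPredict's actual method. Five experts each spend two hours and submit an independent probability, and the crowd's view is then aggregated.

```python
from statistics import mean, median

# Hypothetical: five experts each spend 2 hours and give an
# independent probability that the client wins the matter.
estimates = [0.55, 0.70, 0.60, 0.65, 0.40]

# Two simple aggregation rules for the crowd's judgment.
consensus_mean = mean(estimates)      # average view
consensus_median = median(estimates)  # robust to a single outlier

print(f"mean: {consensus_mean:.2f}, median: {consensus_median:.2f}")
```

With these invented numbers the mean and median differ, which itself is a signal worth surfacing: one dissenting estimate (the 0.40) is pulling the average down.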
Natalie: [26:26] In your system, how do you decide what somebody should look at? Who decides the base information that should be provided?
Dan: [26:32] That's one of the most important aspects. The more consequential the decision, the more you might want to have, actually, multiple people define the information set, because one of the issues is people can see the set of relevant information differently.
[26:47] Part of what we do now is we don't even keep score. We say, "Well, so‑and‑so said X." We can't really peer into the nature of their judgment. Maybe they write a memo. We can't really understand the way in which they're weighing different aspects of the decision.
[27:01] One of the things that's really interesting about experts is when you do statistical analysis of their judgments, experts are actually typically very good at setting the relevant parameters of a decision. Usually if you get a group of experts together, they've got to define, generally speaking, the set of things to look at.
[27:20] What they're not good at determining is what are the weights that I should apply to the various factors. That's where people really have trouble.
Natalie: [27:28] They're probably coming from, maybe not a personal bias, but certainly a lens or a view based on their own personal experience.
Dan: [27:35] Even when they get it right.
[27:36] [laughter]
Dan: [27:37] Even if you get it right, you can't articulate, "Oh, it's 0.2 this, it's 0.7 that, it's 0.1 and 0.05 that." Even an expert who's nailing it can't articulate it.
Peter: [27:46] Isn't that what we're trying to force people to do, or what we're asking people to do, is articulate it all the way through, and that shows the transparency of their final prediction or risk assessment?
Dan: [27:57] Even by defining the information set that one should use to render a decision, you're already creating more transparency than exists right now, typically speaking.
Peter: [28:06] No, without question.
Dan: [28:08] That's step one, and then you want to understand, "OK, we had a bunch of experts. They differ." You get their narrative about why they differ, and you're starting to see that there's a disagreement about a factor or a weight on something.
[28:20] Maybe you can track performance over time. Say, "Well, on matters like this, this factor tends to drive the outcomes," and it's not just your single‑person expert view. It's actually demonstrable, and that you see the disagreements.
[28:35] One of the things that I like about crowdsourcing decision‑making is you actually can get a minority report. You can get a pre‑cog that has a different take on the particular matter.
[28:45] If I got 10 people together, 10 expert lawyers, let's say 7 of them have one view of the matter and 3 have a different view. Wouldn't you be interested in what the three people had to say? They're smart people, and so they have a take.
[28:57] What if you never even thought of that or considered that because you immediately locked in on a particular perspective? What if those three people have it right, and you didn't even think of it, you didn't even weight that?
[29:09] We talk about risk. You want to understand what alternative perspectives on a problem might look like, but we have not really had a disciplined way of doing that.
[29:17] In fact, lawyers want to go to their clients and typically present a very buttoned‑up, "We've got a handle on this" presentation of the information, but I like to say, "Why is that so important to you?"
[29:30] Why don't you say, "Hey, you gave us what was a challenging case, and we distributed it across our organization, and we have two schools of thought. Seven people think this, three people think this, and I want to explain to you why these three people think this, because it's worthy of your consideration."
[29:45] "On balance we think you should still go with the seven, but I want to be fully clear about what these three people think, and if those three people think something that's potentially catastrophic to you, maybe that's a good reason to settle the thing. Maybe it's a good reason to put it to bed."
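The seven-versus-three split Dan describes can be surfaced mechanically once opinions are recorded. This is a minimal hypothetical sketch: the outcome labels and one-line rationales are invented, and this is not a description of how LexSemble works internally.

```python
from collections import Counter

# Hypothetical: ten lawyers each record a predicted outcome
# plus a one-line rationale for their view.
opinions = [
    ("win", "precedent X controls"),
    ("win", "facts favour the client"),
    ("win", "weak opposing expert"),
    ("win", "favourable venue trends"),
    ("win", "limitation defence available"),
    ("win", "strong documentary record"),
    ("win", "credible key witness"),
    ("lose", "precedent X is distinguishable"),
    ("lose", "missing a key document"),
    ("lose", "adverse recent ruling"),
]

counts = Counter(view for view, _ in opinions)
majority, majority_n = counts.most_common(1)[0]

# Surface the "minority report": the dissenting views and why.
dissents = [why for view, why in opinions if view != majority]

print(f"majority: {majority} ({majority_n}/{len(opinions)})")
print("minority rationales:", dissents)
```

The point of keeping the rationales, not just the tally, is exactly Dan's: the three dissenting reasons may flag a catastrophic scenario the majority never weighed.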
Natalie: [29:58] Is it a push to get lawyers to move more towards the right outcome in a file, as opposed to consistently advocating a view that perhaps isn't the most reliable or correct?
Dan: [30:12] The problem is that we have difficulty decoupling advocacy from prediction. You have your external presentation of whatever the issue is. Internally, it's hard to switch modes and then provide a neutral and dispassionate presentation of where things really are.
[30:29] The common adage that people say is so‑and‑so fell in love with their case. They thought it was better than it actually was, and it's completely understandable. You start to believe your own [bleep] . That's the problem.
Peter: [30:41] As an advocate, at the end of the day, you almost have to get there, right? I often say around here, "After you've done the analysis, after the decision has been made, after you've seen all the warts on it, when you put your advocate hat on, you do have to believe."
Dan: [30:55] Certainly more convincing if you believe.
Peter: [30:58] [laughs] You have to believe, but that idea of separating those hats and making that clear to lawyers as well as clients is an important thing. But I get a little bit uncomfortable and I don't know how to handle this. I wonder what you think about this.
[31:10] I get a little uncomfortable when we're saying, "We're going to restrict the data set on which the crowd is going to make its determination?" When you say, "Experts see different things within the material itself," I also think different people will want different pieces of information.
[31:26] And so, if we restrict the data set, we say, "OK, let's say these four documents for the sake of simplicity," there are people in the group making their decision that will want to go outside of those four documents or don't want to be restricted in that way. I wonder how that impacts the prediction.
Dan: [31:39] One of the questions we ask is, "What information, if available, would aid this decision?" In other words, soliciting them to tell us other information that might be available.
[31:47] This is what's so helpful about this. I love your question because it really teased this up. What you have to ask yourself now is, "How much are people really going outside the information?"
[31:57] That's a little bit more of a story than a reality. In other words, everybody is going to restrict the information set implicitly. Even if I told you you had all the information in the world, you wouldn't use all the information in the world. You define the space. You reduce the space to something that's tractable.
[32:15] It's not like everything is up for grabs. Reality is you are going to look at a defined set of information. That defined set can be fairly wide, but it's not as if you're going to truly consider the whole world as possibly impacting the outcome here in whatever the issue is at hand.
[32:31] This is exactly why we want to move to methods, because we're being more clear about the set of information and the basis on which judgment is being rendered, rather than this black box where I have no idea on what basis somebody is actually coming to their decision.
[32:45] You're right in the sense that, if there's this fifth document and it's really important, and you didn't give it to the people, then that's certainly problematic. That would be equally true, however, if I give it to that single expert and the same thing happens.
[32:57] At least in this instance, I've been clear about what information is in the set. I would much rather have my method over a portfolio than the alternative that currently exists. I'll bet my own money on the fact that this is a better way to do it.
Peter: [33:11] There's no argument there at all. My only question is there are times when I'm looking at the information that we've gathered or compiled and we're basing our decisions on and I look at it and I say, "I want somebody to run this analysis and come back to me. I want to add that to the data set."
[33:26] That's why I really liked what you said earlier, that part of the questions are, "If you could have another piece of information, what would it be?" As long as we're building in that ability or that inquiry, then we've buttoned this up and provided that opportunity.
[33:40] I don't know how far I've gotten into this with you, but we built something for our practice that I guess parallels LexSemble. And so, we walk our clients through this analysis and use it as a collaborative tool.
[33:51] One of the things that you've keyed on is something that has really impacted us, reinforced our beliefs that this is the path to be on and it's something that you touched on.
[34:00] What we're finding is that the ability to have those conversations with clients is adding things to our analysis or giving things to our clients that we wouldn't have without this structure or without this tool or without this transparency.
[34:13] I look at what we're doing and I think, "We do it so much better now that we have a process or an ability to bring the client into that conversation and have them participate that I don't know how you could have that any other way."
[34:26] I guess there's no question. [laughs] That was a rant but...
Natalie: [34:29] There is no question, right? This is a better way to do it.
Peter: [34:31] The idea that a lawyer would go to a CEO and give them the final results without the analysis or without providing them with the opportunity to engage or to participate in that analysis or to critique it, one, is foolish.
[34:44] But then I think to myself, "Does this fit into this whole idea, again, that lawyers see themselves distinct from non‑lawyers and that's why there's an unwillingness to open this up and to show the CEO or the CFO, 'Look, here's the basis of analysis and your contribution is valued and important to this whole process'?"
Dan: [35:03] They're only your ultimate client so there's no reason that you shouldn't get their take.
[35:07] [laughter]
Dan: [35:07] The thing that, on the CFO side of things where I've spent a little bit of time, is they are desperate to have their lawyers talk and think this way. The lawyers are speaking Mandarin and the CFOs here are speaking English and they're coming from two different worlds in terms of mindset.
[35:23] If I ultimately work for somebody, I would try to make myself look more like them, not the other way around, rather than say, "I'm exceptional and you have to essentially kowtow to my paradigm or my way of thinking about things." Again, it starts really early in training where we push the narrative of lawyer exceptionalism.
[35:42] We say, "This is the way we do things." I always loved the phrase, "This is the way we do things here."
[35:47] If you ever go to an organization of any type and somebody says, "Well, that's just how we do things," that means they have no independent justification for what they're doing. That's the last resort argument somebody puts in front of you.
[35:58] It's like, "Well, that's just how it's done." Why is it done this way? Why? Why would you run things like this?
[36:05] All you have to say is, if you're managing risks, "We have well‑known tools to manage risk." I'd like to say a little bit more about this: the idea of Fin Legal Tech is really about getting law back on track when it comes to managing enterprise risks.
[36:19] We're fundamentally off track right now. We have a fundamentally unsound way of going about characterizing/predicting and helping organizations manage risks and we've gotten away with it.
Peter: [36:31] I guess, and I don't know how much this matters, but I wonder, "How did we get away with this?"
Dan: [36:35] We said it was complex. We said it's, "Oh, you can't understand this. You're a 'non‑lawyer'."
Peter: [36:40] I'm surprised, I guess, that there hasn't been more pushback before. I guess bringing us to present though, are you hearing or do you see more clients demanding it now or asking for it now or saying, "No, no, no, hold on. This idea that it's too complex for me is just nonsense"?
Dan: [36:54] With LexSemble, which is the product that we built, we are in the process of piloting with a few law firms. In a larger firm, it's going to be a specific lawyer in a specific practice area who is enlightened around these ideas and has an appropriate use case that fits well into this particular paradigm.
[37:13] But what will happen over time is people will see the success. That's my view and will copy it and that's fine by me. It will lead to an exposure of how unseemly things are now.
[37:26] That's why I'm so interested in financialization because let's say that I'm 10 percent better at forecasting events. The problem is now, if you don't have skin in the game, you can't actually catch the upside of being 10 percent better.
[37:38] That's why I want people to have some class of contingency. Then you actually say, "That 10 percent difference over a portfolio of matters whether they were regulatory outcomes or case outcomes or transactions or whatever is actually a lot of money."
[37:52] Of course, as I like to say when I go to law firms, "We're not running a 501c3 here. This isn't a nonprofit. This is a conversation about money." I don't think that's inappropriate if we're running a business.
[38:03] The money is what will speak here, and honestly it's what should speak.
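Dan's point about capturing the upside of a forecasting edge can be made concrete with toy numbers. Everything below is hypothetical: the stakes, the win probabilities, the 30 percent contingency rate, and the crude modelling assumption that being "10 percent better" shows up as a 10 percent relative lift in realized win rates across the portfolio.

```python
# Hypothetical portfolio: amounts at stake and win probabilities
# for five matters taken on a 30% contingency fee.
stakes = [500_000, 200_000, 1_000_000, 300_000, 750_000]
win_p = [0.65, 0.40, 0.55, 0.70, 0.50]
contingency = 0.30

def expected_fee(p, stake, rate=contingency):
    """Expected contingency revenue for one matter."""
    return p * stake * rate

baseline = sum(expected_fee(p, s) for p, s in zip(win_p, stakes))

# Crude proxy for a "10 percent better" forecaster: a 10% relative
# lift in the win rate realized over the portfolio (capped at 1.0).
improved = sum(expected_fee(min(p * 1.10, 1.0), s)
               for p, s in zip(win_p, stakes))

print(f"baseline expected fees: ${baseline:,.0f}")
print(f"improved expected fees: ${improved:,.0f}")
print(f"upside of the edge:     ${improved - baseline:,.0f}")
```

On these invented numbers the edge is worth tens of thousands of dollars over just five matters, which is Dan's argument in miniature: without skin in the game (the contingency), that entire upside line disappears.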
Peter: [38:08] That's one of the things that you and I keyed on when we met up during the LegalX Conference after you spoke is when we talked about our firms' findings and our data and all that other stuff. We talked about this idea that one of the things that was really striking to us and smacked us in the face was, if we had more skin in the game, our revenues would have been much higher than they otherwise were.
Natalie: [38:26] You mean if we'd worked on contingency?
Peter: [38:28] Yes, if it wasn't just our reputation and quality of work and all that other good stuff. If we had more contingency or blended contingencies or something like that based on the data that we had, our revenues would have been significantly more than they otherwise were. With small improvements, this could mean more and more.
Dan: [38:44] The problem we've had with innovation in law is you couldn't bootstrap it. In other words, even if you built a better mousetrap, it was hard to monetize that to catch the upside.
[38:53] And so, that's why it's become an obsession of mine to try to create these essentially back‑door or front‑door finance mechanisms, more explicit honey pots for people to carry off innovations, and then catch the rewards of those innovations.
[39:08] We'll get a lot more innovation in legal if you can monetize it most directly on the outcomes.
Peter: [39:14] One of the other things I'll say, and I hope you take this the right way and I think that you will, is that you guys developed LexSemble, and it's important that people understand when they hear us talk that I don't think any of this is rocket science.
[39:27] This isn't something that's so out there or difficult to understand, build, or implement or use. This isn't the thing of sci‑fi or fantasy. These are things that have been around in other industries for a very long time.
[39:40] Now, LexSemble's a great example of combining these ideas and existing technology to build something to allow lawyers to increase the accuracy of probabilities, come up with a structure, and monetize it at the end of the day.
Dan: [39:54] I don't take offense. I completely agree. In a sense, we're very up front about this. You could get out a pencil and paper and replicate a lot of what we're doing.
[40:03] You wouldn't want to do it. That adds a lot of friction. It takes a lot of human time to oversee it, but it's not like, at its core, we're doing some magic machine. It's actually very, very straightforward.
Peter: [40:12] That's been exactly our experience as well. We were doing a lot of this analysis initially with a paper and pen and doing it better than most, but not as well as we could with a tool that could help us crunch numbers and run different permutations as we saw fit.
[40:26] Now that we see the power of that, again, with a pretty simple tool relatively, it's amazing the gains in efficiency and the insight that can be gained.
[music]
Peter: [40:34] What did you take away from the interview with Dan?
Natalie: [40:40] There were two things that he touched on that, I guess, really spoke to me. The first was the massive weaknesses of this whole cult of the single expert and the problems with going to the opinion of one lawyer. Then the other thing is the fact that law can be a science.
[40:57] It speaks to me, I guess, from my background. I've got much more a math and science‑y background, engineering, that stuff. When I look at legal problems, particularly litigation analysis, I see it as a mathematical quantifiable issue.
[41:13] To hear someone else talking about that and saying that there are law firms implementing this, that's huge for me.
Peter: [41:20] Did he say that law firms are implementing it?
Natalie: [41:22] He said he's piloting his software in firms. To me, that means they're using something.
Peter: [41:27] Dan is helping law firms initiate this type of thing through LexSemble.
Natalie: [41:30] Absolutely. And not only is he helping, having the Fin Legal Tech Conference and all that stuff, he's drip‑feeding that mindset change that we need to have.
Peter: [41:43] Where are we do you think in that evolution? First of all, do you think he's right? Do you think lawyers are going to adopt a more, I guess, scientific view or...?
Natalie: [41:53] There is no doubt in my mind that that will happen.
Peter: [41:56] Wow. Why? Where do you see it happening now? It's us and Dan.
Natalie: [42:00] [laughs] I don't think that's bad.
Peter: [42:02] He talks about the litigation financing. We talked about Premonition before.
Natalie: [42:08] It's coming from both ends. You've got people who are quantifying existing case load and they're finding the data set that we have.
[42:14] Then you've got people who are coming at it from the actual analysis angle who are saying, "Now we have a data set, and now we can sit down with a mechanism, not pen to paper but the equivalent, and start recording all of this."
[42:27] You're going to get the squish in the middle. Add to that the pressure from the client to give you a straight answer about, "What are my chances of success?" All three of those things will converge with the outcome that I want.
Peter: [42:39] What else did you learn from Dan?
Natalie: [42:43] When he was talking about the cult of a single expert, one of the things that flagged in my mind is, "Does that position a larger law firm to do this better than a smaller law firm because they inherently have more experts?"
[42:55] If and when they ever manage to, I guess, progress to the point where they are crowdsourcing judgments and predictions and things like that, they've got a bigger set of experts than a smaller firm. They might be better positioned to do it.
Peter: [43:09] It's an interesting thought. I don't know. I bring it to our practice area, tax litigation, and I think to myself, "Well, even if you're at a big law firm, how many tax litigation departments are bigger than our law firm anyway," right?
Peter: [43:21] At what point is enough enough? Do you take as many experts as you can and continue to feed them in? I guess the answer to that is yes.
[43:32] But the real interesting step is figuring out how to do this outside of law firms. The first step is agreeing that there ought to be a methodology, even for the single expert, because we're not even there yet.
[43:50] Then saying, "OK, now we have a single expert following this methodology. We've improved the current state of affairs, and now let's get more people from inside the firm to record or contribute to the prediction."
[44:06] The real big step that happens is when that system allows us to go outside of the law firm and take into account individuals who are not part of that, I guess, silo. I believe that, if you can achieve that, then you're getting real independent thought.
[44:21] He talked a little bit about the idea of how decisions are made now or how probabilities are arrived at now or how legal analysis is done now. Everybody piles into a board room and you have the senior partner say, "This is how I view the matter," and then everybody falls in line, either consciously or subconsciously in some circumstances.
[44:40] That's always going to happen to some extent if you're constraining the analysis to one organization itself. And you can make larger inroads if you can break out of that even further.
Natalie: [44:50] But the idea is you wouldn't know what other people have said already. You're saying, I guess, informal discussions would have taken place prior to even filling out your probabilities.
Peter: [45:00] Yeah, you're working on a file together. The impact is that we all start to see things through a similar lens. The extent that you can break away from that, the better the analysis and predictions will likely be.
Natalie: [45:16] You'll get more of those, you mentioned the pre‑cog minority report. What you're saying is that's more likely to come from outside of a firm.
Peter: [45:25] Yeah. We published the countermeasure article on our blog, in which we talked about the system we've built to do this. One of the things that we mentioned in the blog post was The Good Judgment Project. That really influenced how we looked at countermeasure and how we built countermeasure.
[45:44] What you start to realize when you look at prediction, or when you start to track prediction and who's good at it and who's not, and The Good Judgment Project is a great example of this, is that the people you think have the ability to predict more accurately are oftentimes not the ones.
[46:05] And so, anytime you can get to those outliers or unique points of view, and then identify them as highly accurate predictors, first, that's going to be really surprising to many people. Second, again, that could raise the level of analysis and maybe identify people who are quite good at prediction who are outside the organization.
Natalie: [46:24] It'd also be really good, within the organization, if you're a junior associate or even if you're a student of an organization. This is a perfect mechanism for you to show your expertise. If you're one of the ones who's logged as accurate predictions all over the place, you become a rising star that perhaps you wouldn't otherwise have been noticed as easily without this.
Peter: [46:47] One of the points that Dan made is it doesn't take a lot of time. The same 10 hours can be allocated to analysis and prediction across two people at five hours each, or five people at two hours each. When that understanding comes into play, you can canvass juniors who are maybe not involved in the file or other lawyers who are not involved in the file.
[47:10] We had a file recently that Yoni and I were on. We brought you in and the probabilities that you set for different aspects of the file were very different than Yoni and I who were knee‑deep in the file for many, many months. At the end of the day, you were right and your predictions were more accurate.
Natalie: [47:29] Quick. Write it down. I was right.
[47:32] [laughter]
Peter: [47:32] Lucky for us, I started tracking these types of things or else.
Natalie: [47:34] But it is tough to track.
Peter: [47:36] In the current state, it's tough to track.
Natalie: [47:38] We...
Peter: [47:38] Nobody tracks it. Nobody even attempts it.
Natalie: [47:40] It requires attention like anything else, right? What I mean is it's hard to learn how to put a number on something.
Peter: [47:49] This came out in the podcast and I said it, and I think Dan said it as well, is that this isn't hard. It's a matter of starting. It's a way of thinking and it's a discipline.
[48:03] If you can combine those two things with a system that allows you or that helps you do it with less friction, this isn't difficult. We've seen, and Dan can attest to the fact that, the upside is significant.
[48:16] The point of all this though, and I love the fact that he made this point, is that not only is this the right way to conduct this type of analysis for any number of reasons, but one of those reasons is financial to the law firm itself. If you can get better at assigning more accurate probabilities, this could have huge positive financial ramifications for the law firm.
[48:43] If nothing else, if doing the right thing for the sake of doing the right thing isn't enough of an impetus to start, I love the fact that Dan says quite candidly, "This will put money in people's pockets."
[48:55] Because if you can be 10 percent more accurate than your competitors or than you are today and you have some skin in the game, that could put a significant amount of money into that law firm's practice.
[49:08] People hear all this and some of them are going to dismiss it, I guess. The people that understand and want to start to take steps in this direction, what should they do? Where do you start?
Natalie: [49:22] It depends on how many people are involved. If you're in a larger firm, you do need a tool to help you because, otherwise, the implementation would be difficult. At that point, I'd be calling Dan.
[49:33] If you're in a smaller firm, you could start in something as simple as Excel or a Word document or something like that. I would say, it depends on the number of people.
Peter: [49:43] Dan's looking for beta testers, right?
Natalie: [49:47] Absolutely.
Peter: [49:47] And so here's a circumstance in which somebody built the tool. We went through this whole podcast. He explained that he had something called LexSemble. We didn't dive deep into it because we were more, I guess, dealing with the higher‑level issues.
[50:02] We'll put LexSemble as a link in the show notes. We would encourage anybody to go take their own look, but what is LexSemble?
Natalie: [50:09] It's a platform that allows you to crowdsource and aggregate predictions from across your organization.
Peter: [50:15] Basically, this is all about leveraging human capital in a way that's not being done now. It's about systematizing analysis and providing a framework for more rigorous analysis, which is not happening now.
[50:30] The second thing it's doing is it's leveraging a crowd of experts to increase the accuracy of those predictions.
Natalie: [50:38] Then the third thing they're adding on as well is it will also give its own prediction based on what it's seen so far, so looking at case history, judgments, outcomes.
[50:49] As they expand that data set, both within a firm and from external data, you'll also get to a position where the machine itself, so LexSemble itself, would have its own prediction entered into the rest of it and weighted with the lawyers as well.
[51:03] It's humans and machines by the end of it.
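The "humans and machines" blend Natalie describes is often done by weighting each source by its historical accuracy. This is a minimal hypothetical sketch: the estimates, the track-record weights, and the source names are all invented for illustration, not taken from LexSemble.

```python
# Hypothetical: blend lawyer estimates with a machine estimate,
# weighting each source by a made-up track-record score.
# Each entry is (probability estimate, track-record weight).
sources = {
    "senior_partner": (0.70, 0.60),
    "associate": (0.55, 0.75),
    "machine_model": (0.62, 0.80),
}

total_weight = sum(w for _, w in sources.values())

# Weighted average: sources with better track records count more.
blended = sum(p * w for p, w in sources.values()) / total_weight

print(f"blended probability: {blended:.3f}")
```

One design note: weighting by track record means a junior associate who predicts well, as discussed above, automatically gains influence over the blended number, machine or not.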
[51:06] When I see this moved towards transparency and I know it's coming in every industry, it's coming around the world. Everything is moving towards being more transparent.
[51:14] It would be really nice if lawyers and the legal industry could push this to the front of our industry and be leaders in making things more transparent. Is that unrealistic?
Peter: [51:26] If we bring this to the forefront of our industry now, it seems like what you're implying is that we should be proud of the fact that we did this?
Natalie: [51:34] Absolutely.
Peter: [51:34] Where have we been for the last hundred years that we haven't done this to this point?
Natalie: [51:40] In a cave.
Peter: [51:40] Now Dan and others like him are dragging most lawyers kicking and screaming in this direction, and by the way, I don't know who's paying attention yet. But they're dragging us [laughs] there now.
[51:50] We willingly will get dragged to end up in the place that law should have started at, which is having a more scientific view of things. Now we're going to get kudos? No. This should have been done forever ago.
Natalie: [52:04] But it wasn't, and if we can make leaps and bounds now, I think those should be recognized.
Peter: [52:09] Great.
Natalie: [52:09] Perfect inspiration. Come on.
[52:11] [laughter]
Peter: [52:12] We'll have some lawyers giving lawyers award ceremony.
Natalie: [52:16] Yeah, really.
[music]
Natalie: [52:17] Don't forget to stick around until the end of the podcast to hear a listener's I Am Building NewLaw story. For this episode's show notes and transcript and how to satisfy your Law Society CPD requirement please visit our website at BuildingNewLaw.ca.
[52:34] We'd love to hear from you and if you have any feedback feel free to send an email to Info@BuildingNewLaw.ca or come and find us on Twitter @BuildingNewLaw. Don't forget to subscribe on iTunes, our website, or wherever else you get your podcasts.
[music]
Peter: [52:51] Thanks for sticking around. Now, one of our listeners shares how they are building NewLaw.
Tamara Eberle: [52:57] Hi, I'm Tamara Eberle. I'm one of the founders of Traction Strategy. Traction is an award‑winning professional facilitation company that specializes in innovation and creative problem solving, strategic planning, and change leadership.
[53:11] Think of us a bit like an orchestra conductor. Our clients call us when they need clarity and focus and to get everyone onto the same page, playing to the same score. We work across sectors, but have found that the legal sector is a pretty exciting place to be working in right now. There's so many people interested in planning for what is shaping up to be a game changing technology filled future.
[53:41] People want to innovate and differentiate themselves in the market to stay competitive. Our role is to help companies and teams bring together their collective knowledge and expertise by planning, strategizing, innovating, and implementing change without too many bumps along the way.
[53:57] We do this by custom designing and leading collaborative and engaging events and developing tools that stimulate thinking and activate ideas.
[54:00] We have a very unique tool box of processes that we've gathered from all over the world that can help teams solve wicked problems and keep people engaged. We conduct innovation labs and use game‑based tools, creative and critical thinking techniques, and design thinking and prototyping processes.
[54:18] In our experience, one of the most important elements in staying competitive in this changing context is the openness for true creative thinking and taking a bit of risk as well.
[54:30] While change isn't easy, it is worth being an unconventional thinker, and we'd be really happy to help you out with that. You can visit us at tractionstrategy.ca or feel free to email me at tamara@tractionstrategy.ca.
Sponsor: [54:46] Thanks for listening to the "Building NewLaw Podcast" brought to you by Counter Tax Lawyers. To learn more about Counter, go to countertax.ca.
[music]
Lawyers that have completed the S02E05 BNL CPD can claim a
- To access the S02E05 verification examination click this link.