

The Accounting Technology Lab Podcast – How to Determine Valid Content on AI – Nov. 2023

Technologists Randy Johnston and Brian Tankersley, CPA, give advice about how firms that are adopting AI technologies can determine the difference between real and fake data.



Transcript (Note: There may be typos due to automated transcription errors. Also, due to the intro to the podcast, add approximately 10 seconds to the time stamps below.)


Randy Johnston, Brian F. Tankersley, CPA.CITP, CGMA.

Randy Johnston  00:03

Welcome to the Technology Lab. I’m Randy Johnston, and I have my co-host, Brian Tankersley, with me. We’d like to talk to you today about how to discern valid content. And for that matter, am I really Randy Johnston? And by the way, are you really Brian Tankersley? How can we be sure? Because in today’s world of deep fake video, and voice, and all sorts of other things, we think it’s going to be much harder to tell that you’re really looking at the real deal. And it doesn’t matter whether you’re on video or audio, or in a phone call where you’re trying to confirm a wire transfer. Gosh, who are we? And how do we know for sure? Now, by the way, we’re not smoking dope. We’re really concerned about this topic for you, and Brian and I have done work in this area, and we’d like to just address some of it. So Brian, I think the How-To Geek piece was actually very interesting. Do you want to give our listeners some background on this?

Brian F. Tankersley, CPA.CITP, CGMA  01:14

So, going back over two years, there was an article on How-To Geek titled “How Deep Fakes Are Powering a New Type of Cybercrime.” For those who have never heard of it, a deep fake is where we use AI to take existing photos, video, and audio of somebody doing something, and then we create fake material that looks like that person. I actually have a session that I do during our tech conferences this year where I show some photos that were circulated right around the time of President Trump’s first indictment, allegedly showing him resisting arrest as he was being taken in by a number of police. They had been circulated on fake news sites, and it was a completely fake thing. I also showed a video of Prime Minister Trudeau of Canada, who has been fairly controversial during his term, reading a book that was supported by the truckers who protested against some of his actions that endangered their livelihood during the pandemic. I’m pointing this out to you here because Prime Minister Trudeau obviously is on the left side of the aisle and President Trump is on the right side of the aisle, so you can see that this has been used against targets on both sides. And I think with the election coming, we have to realize and acknowledge that, just like we have to worry about fake invoices, we now have to start worrying about fake video calls and fake audio calls with people, simply because the real-time translation and content generation capabilities now are beyond what you may have seen before.
There is actually a post on the blog of Binance, which is, I guess, a crypto organization. They have a coin called Binance Coin, which is a cryptocurrency (I think it’s a stablecoin, if I remember right), and they also have a crypto exchange. Their chief communications officer, Patrick Hillman, actually posted an article to his blog saying that he had allegedly virtually attended a meeting that he never attended; he had actually been impersonated by AI. The scammers created a deep fake of him from the audio and video captured in a single Zoom video meeting and turned that into a deep fake model of him. We’ve also seen incidents where people have called CEOs trying to confirm that a wire transfer had been ordered, and somebody had picked up the CEO’s accent and the inflections in their voice and all kinds of other things, built that into a deep fake model, and then actually used it. So again, we have to be very careful, especially about our wire transfer verification protocols and other things like that. You know, my wife works at a bank, and she’s telling me that this kind of stuff is really taking the business email compromises and the wire transfer scams that we’ve had for decades to the next level.

Randy Johnston  05:23

And Brian, you know, along this line, obviously, with our podcasts, there are ample copies of our voices in public circulation. And if you think about these tools, where you can pick up pictures or video or voice recordings, the tools will recreate the voice. Now, as you know, we’ve been celebrating Mom’s 100th birthday, and because of that I have, I’ll call it, oddities in my voice, because she’s an Australian immigrant, a war bride. In fact, you know me well enough that every once in a while I’ll come off with what would be considered bad language in Australia, you know, “get rid of the bloody thing,” or, I don’t say it as much anymore, “pound and a half.” And it’s like, where’d that come from? Or once in a while I’ll talk about the boot, and I’m not talking about a shoe, I’m talking about the back end of a car, which we’d call a trunk, right? I picked up all that in my language. And it turns out that in my writing style and my language, there are idiosyncrasies in the way I talk, the way I speak, if you will, the way I write. And what we’re warning you about, friends, is that these deep fakes are good enough now that, in real time, they can take a bad actor saying whatever they want to say, and it comes out sounding like Randy or Brian.

Brian F. Tankersley, CPA.CITP, CGMA  06:54

Well, I will tell you that as somebody who moved from rural Tennessee to Northern California in the mid-80s, people were fascinated with the idioms that I used: if somebody thought a lot of themselves, I’d say he thinks he hung the moon, and other statements like that. These deep fakes can pick up on those idioms Randy’s talking about, and the inflections and the terms that are used, and then they can take a statement that you’re going to make, and they’re smart enough to be able to make the changes. So what actually happens here is that we use an encoder to take your face or your audio and create a kind of compressed encoding of it, and then we create a decoder that will take that encoding and reconstruct it. So what happens then is we take two faces, face A and face B, and we create encoders and decoders for each one. And then we turn around and take the encoded data from face A and run it through the decoder for face B, and we can basically reconstruct a fake audio set and a fake video set from this. It’s actually pretty remarkable. I actually looked at this and I said, no, there’s no way, there’s just no way this is possible. And so I decided that I would throw some money at it, and I went out to one of these deep fake sites, uploaded a single photo of Randy and a single photo of me, and then I cut the title sequence out of the Blues Brothers. And I replaced Jake’s face with my face, and Elwood’s face with Randy’s face. And it actually worked amazingly well.
I’m mentioning this to you here because Google actually announced yesterday that they are going to require disclosures by advertisers during this next political season when they’re using deep fake content, or AI-generated content that is not actually legitimate. So I mention that to you here because I think it’s very important that we be very careful with wire transfers and with these business email compromises and other things like this, because the days of being able to spot the deep fake emails and the deep fake voice calls and other things like that are coming to a close. You know, I’ve received in the last 48 hours three or four text messages that all looked pretty legitimate until I went through and looked at the URLs they were directing me to, and I knew that the Postal Service or UPS or whoever would just never use that domain to do that. So I think it’s very important that we all, and I’ll use one of my favorite southern phrases, kind of get our dander up a little bit, get a little bit excited about this, and watch things much more critically going forward than we have in the past, just because of the simplicity of creating this stuff. You know, if I’m from Tennessee, which is number 49 in education, and I’m a public school graduate from the 80s, and I can come through and end up creating a deep fake in under an hour, I assure you that the bad guys, who have more resources and more motivation and are chasing hundreds of thousands of dollars, can do much more with it.
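[Editor’s note: The encoder/decoder face-swap Brian describes can be sketched in a few lines. This is a toy illustration only: the weights are random stand-ins (a real deep fake trains them on many frames of each person), and all sizes and names here are illustrative assumptions, not any real tool’s API.]

```python
import numpy as np

rng = np.random.default_rng(42)

FACE_DIM, CODE_DIM = 64, 16  # toy dimensions; real models work on images

# One shared encoder compresses any face into a small latent code
# (expression, pose), with identity mostly stripped out.
W_encoder = rng.standard_normal((CODE_DIM, FACE_DIM))

# A separate decoder is trained per person to rebuild *their* likeness.
W_decoder_a = rng.standard_normal((FACE_DIM, CODE_DIM))
W_decoder_b = rng.standard_normal((FACE_DIM, CODE_DIM))

def encode(face: np.ndarray) -> np.ndarray:
    """Face -> compact latent code."""
    return W_encoder @ face

def decode(code: np.ndarray, w_decoder: np.ndarray) -> np.ndarray:
    """Latent code -> reconstructed face in one person's likeness."""
    return w_decoder @ code

# The swap: encode face A, then decode with B's decoder, so person B's
# likeness ends up driven by person A's expression and movement.
face_a = rng.standard_normal(FACE_DIM)
fake = decode(encode(face_a), W_decoder_b)
```

The punch line is the mismatch in the last step: the same latent code can feed either decoder, which is why one captured meeting’s worth of audio and video is enough raw material to drive someone else’s face and voice.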

Randy Johnston  10:54

Yeah, so, you know, I was so happy to be part of the Blues Brothers with Brian. But Brian, I’m just gonna say, bless your ever-loving heart, and some of our listeners will know what we meant by that. But this issue of the fakes is very, very real, and what I’m concerned about is that your clients will fall for these scams, or that you will fall for the scams. I’m a very trusting soul, as most people around me know. But the bad guys don’t seem to have a conscience on this. Actually, they kind of do: they just want to go where the money’s at. So this is the latest way to commit fraud, and it’s going to be committed in the political cycle. As you’ve mentioned, Brian, I think this is the cycle of deep fake political ads, just like we had the cycle of social media in the Obama election, for example. Each cycle, the political people take some new tool and leverage it to their advantage. Well, the bad actors are doing the same thing: they’re taking a new tool, and they’re leveraging it. You know, I’m saying that, Brian, because it used to be the porn guys that did this the most; they’d take the latest technology tools and leverage them. Well, you

Brian F. Tankersley, CPA.CITP, CGMA  12:31

know, I’ve been married for 30 years, so I’m not a consumer of that content. But I have read in Wired and other places that that is actually happening to great effect today. And my wife, who is a banker now and was formerly a social worker for 15 years, has actually told me that you shouldn’t post pictures of your children online, because pedophiles are supposedly taking those photos and turning them into all kinds of odious content. And unfortunately, since it’s AI-generated content, I guess it’s not as subject to the laws prohibiting child pornography, so it’s kind of a loophole in the statute. It’s very unfortunate that that’s taking place. But I would just say that the bad guys, again, are adopting this, and they’re using it for all kinds of nefarious purposes that you can’t imagine, and you need to be on guard for it.

Randy Johnston  13:39

Well, I’m glad I know so little about that. Because I, like you, have been married, right on 50 years in my case. But you know, the bottom line to this is, it seems to me, there is no limit to the lengths that the bad guys will go to. These deep fakes are getting better than you think, and they’re getting better daily. And as CPAs, you’re most likely high profile in your community. Just recognize that if you’ve got a few videos, webcasts, podcasts, or articles that have been written about you, or a conference call done on Teams, Zoom, or GoToMeeting where there’s an audio recording of you, it’s not very difficult to make a deep fake model to impersonate you. So just be super aware that these tools have become easy enough to use that it’s pretty stunning. Well, Brian, I know we could keep going on and on about this. Do you have other key closing thoughts for our listeners?

Brian F. Tankersley, CPA.CITP, CGMA  14:55

The deep fakes are gonna get better. Again, it’s not difficult at all to train a new deep fake model, and the more content of you that’s out there, the more likely it is that one is going to get generated. So I just want you to be aware of this, and you may want to put different validation protocols in place. You know, I’m not trying to make you into secret agents, but there may be certain words that you say, or certain codes, or questions that you ask as part of your validation. So for example, as we’re ending, you might ask me how my younger sister is doing. Well, I don’t have a younger sister, but that might be the tell in a conversation to authorize a wire: yes, I’m really authorizing this, because there is no younger sister. So again, I’m not trying to get you into some kind of spy tradecraft here, but I do think that we have to look at those wire authorization procedures, and we have to look at the evidence that we get today much more critically, because of the deep fakes and the AI-generated content. You know, honestly, if we can generate a fake version of a YouTube video, or of a full feature movie, with some of these tools, then generating fake invoices, kind of the way the ZZZZ Best scandal did back in the 80s, is old hat.
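[Editor’s note: Brian’s “younger sister” example amounts to a pre-agreed challenge question whose correct answer is deliberately untrue, so a deep fake trained on public audio cannot guess it. A minimal sketch of that idea; the question, answer, and function names are hypothetical, not any firm’s actual protocol.]

```python
# Hypothetical out-of-band challenges agreed on in advance. The "right"
# answer to each is deliberately false, so a voice clone that answers
# plausibly ("she's doing great!") gives itself away.
CHALLENGES = {
    "how is your younger sister doing?": "i don't have a younger sister",
}

def wire_request_verified(question: str, answer: str) -> bool:
    """Return True only when the caller gives the pre-agreed tell."""
    expected = CHALLENGES.get(question.strip().lower())
    return expected is not None and answer.strip().lower() == expected
```

The design choice worth noting: the secret is the *wrongness* of the answer, not the answer itself, so nothing scraped from podcasts, bios, or social media helps an impersonator pass.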

Randy Johnston  16:24

Yeah. And you know, as I’m listening to you, I’m thinking about revisions I want to make to our internal control procedure pieces, for both accounting and for IT, because I realize we probably need to be proactively preventing some of these things and have new procedures in place. And I think so many of our control procedures are weak at best in small businesses, so this probably needs some pretty extensive updating. I hadn’t really wrapped my mind around it until we were discussing it today.

Brian F. Tankersley, CPA.CITP, CGMA  17:03

Well, we thank you for your time for being with us today, folks. I appreciate the opportunity to be with you and to talk about these things, and I know Randy does, too. We’re very excited about the changes that are happening in the world of technology, and we appreciate the opportunity to have you join us and to listen to our discussions about how these emerging technologies actually affect you in the real world that we all live in.

Randy Johnston  17:35

Yeah, and you know, I’ve said this in other Technology Labs with you: all technology can be used for good and bad, and it makes me sad when I see technology used for bad when there is so much that can be done with the effort to do good. But maybe that’s the Pollyanna part of me in play. You know what, I sleep well at night, and I like it, and I hope you get to as well. Do keep your wits about you when you’re trying to discern valid data and potential deep fakes. We’ll talk with you again in another Technology Lab. Good day.

== END ==