That Thing You Do: Early Thoughts On AI

There’s a scene in Apollo 13 where Kevin Bacon needs to do some calculations. And he’s exhausted and under a lot of stress because, you know, he’s in a broken tin can a zillion miles from earth floating around with Tom Hanks and Bill Paxton. So he asks Mission Control in Houston if they can verify his math. And Mission Control says “sure, not a problem”, and then the camera turns to a row of math nerds with pencils who are going to run the numbers by hand, and then compare notes. That was how they did it in the days before calculators, in the days when computers filled a whole room and couldn’t be bothered to work on, you know, making sure Kevin Bacon was toting up his figures properly. Four crew-cutted nerds with #2’s.

And every time I see that scene, once I get past the sort of archaic lunacy of it, I think “Really? That’s what those guys were there for? To do math? Couldn’t they have been doing something more, I dunno, important? Like maybe figuring out why the CSM LiOH canister and the LEM canister weren’t the same shape or something?”

I’ve been thinking about all that a lot as I listen to everyone talk about AI and ChatGPT.

For most of our time on this planet, the only machines humans had were, well, humans. And yeah, the human body is great – it can do a lot of things. So if you don’t have a truck, well, you’re the machine that’s gonna get the load to market. If you don’t have a backhoe, you’re the machine that’s gonna dig the grave. And if you don’t have a calculator, you’re the machine that’s gonna run the numbers (crewcuts optional).

But like most machines that can do a lot of things, the trade-off is that it can’t do any one of those things exceptionally. Because it’s built for diversity, not specialization. A truck can haul the load in fewer trips than a human can, a backhoe can dig the hole faster, a calculator can run the numbers with fewer mistakes. But a calculator is less capable of dragging a load to market than you are, and a truck is fairly useless where running the numbers is concerned. The human body can do all those things – not A+ perfect, but better than, you know, not at all. Which is the alternative.

When we look at AI and ChatGPT and all the others that have come out since I started writing this essay, it should be in that context: what have we been mediocre at that this new technology can free us from doing a mediocre job of, so we can focus on something we’re actually good at, indeed, better at than machines? As my buddy Howard McCabe asked, can it scan reams and reams of code for bugs faster and more thoroughly than a human can? Yep. And if it does, does that free up a human to think more deeply about what humans would really want that code to do and how they might use it? Yes it does. Because it can help us by doing better than us the things we are not built to do well. So why wouldn’t we want that?

But here’s what it can’t do. It can’t make quantity equal quality.

For while I think there are opportunities for it to free us up to do better work, I am concerned that we are falling into a trap that is rampant in advertising generally. Namely that more = more effective. Which, you know, no.

The fact is that more of what I don’t care about doesn’t make me care about it. More of what I don’t want doesn’t make me want it. More is just noise, static, interference. More is just the stuff that actually gets in the way of the stuff that I do want cutting through. More is why people hate advertising (well, one of the reasons).

But “more” is the last refuge – well, the first refuge – of advertisers who are either too lazy or too stupid to really think about their customers. “More” is the strategy of marketers who don’t think their customers matter, or more dangerously, don’t think their own products matter, and so haven’t taken the time to find that unique quality, that unique difference, that unique thing that customers are missing and desiring that their product can provide, in order to really make a connection. They just say “What I say isn’t important – if that’s where my people are, that’s where I’m going to be too”. Well, yeah, pal, but there were a lot of people at the Lizzo concert too, and 99.99% of them were only paying attention to one person.

“Just showing up” (as I have written elsewhere) is not a brand strategy, but a lot of what we are hearing right now is that AI and ChatGPT are the future of advertising because they will generate exponentially more content, which will let brands “just show up” an order of magnitude more than they do now. And agencies will likely fall for this because, well, there are a lot more bad, lazy and stupid ones than there are good ones. And this will undoubtedly elevate the public’s already keen ability to ignore the ads they see, and accelerate the development, use, and effectiveness of ad blockers and other devices that basically say, “oh no you don’t”. All of which will make what we do less effective.

So what do we do? Because if we’ve learned anything in advertising over the past hundred years it’s that anyone who bets against the technology will lose.

What we do is what smart agencies and smart clients have always done when faced with a cosmic leap in technology: use it with insight and imagination (often another way of saying “creativity”) to make work that people actually care about. That they think about when all those other things they don’t care about are avalanching them. It’s as simple – and as difficult – as that.

Who said advertising wasn’t rocket science?