July 17, 2024

Why AI Won’t Fix Hiring


Episode Highlights

Why Human Intuition Outperforms AI In Hiring (4:01)

Not Every Recruiting Problem Can Be Solved With Data (Or AI) (6:17)

If You Rely On AI, You'll Stay Dumb (11:25)

Subscribe to the Talent Insights podcast on Apple Podcasts, Google Podcasts (recommended for Android users), Amazon Music, or Spotify. Watch us on YouTube—and don’t forget to rate us!

Another day, another AI tool that promises to change the world. But won’t.

It’s not a big leap to think we’re at the peak of the AI hype cycle. But in recruitment – a sector already trailing the technology curve – what if it actually makes things worse?

Who is going to train AI on a good interview process if there isn’t already a good interview process? What happens when an AI bot breaks down and no one knows how to fix it? And why would we think taking more human involvement out of an experience often lacking human touch will be a good thing?

James Hornick and Jeff Smith break down why we shouldn’t expect a recruiting cheat code in episode 92 of The 10 Minute Talent Rant, “Why AI Won’t Fix Hiring.”

Episode Transcript

The 10 Minute Talent Rant is live. I’m James Hornick joined by Jeff Smith, and we are on the clock. The 10 Minute Talent Rant is our ongoing series where we break down things that are broken in the talent acquisition and hiring space. Maybe even pitch a solution or two. Before we dig in, all of our content can be found at talentinsights.hirewell.com.

This week’s topic Jeff, you ready for episode 92? Yes. Why AI won’t fix hiring. Yes. It’s been a real slow news week so I think- nothing happening in the world. No. Everybody’s really, really thinking about AI right now. Yeah. So I’m going to take a dump on AI today. It should not be shocking to anyone.

And I do this knowing there’s a whole world of pro-AI hucksters out there ready to disagree with every point I make without being able to back anything up, because they’re just going to ask ChatGPT for a response, which literally happened twice the last time I talked about AI. Not to mention there is the inverse reality as well, which we’re going to get into.

There’s certainly utility to it. But anyway. Just start Googling how many AI recruitment tools there are, and you can replace “recruitment” with anything at this point. Mostly bullshit. I mean, everybody says they’ve come up with some new idea when, in reality, specific to recruitment, they all just rely on keyword matching.

It’s all top-of-funnel focused. And in our opinion, that solution has been cooked for a good long time now. Yeah. It all leans into, I think, the perception of people who don’t understand recruiting: they think, oh, you just need more people, so we’ll make finding more people at the top of the funnel faster.

Like, that’s absolutely not the problem with hiring. That’s the easiest part; we have that down. But anyways. It’s the more complex things that are why hiring can be a drag, why things get broken, why everyone has a poor experience out there. So I did a little more research on this. I love geeking out on nerdy stuff.

So Polanyi’s paradox, here’s a new one. Named after the Hungarian-British philosopher Michael Polanyi. It’s the theory that human knowledge of how the world functions, and our own capabilities, are to an extent beyond our explicit understanding. We can’t solve every problem ourselves, therefore a computer can’t solve every problem, because so much of the world is too complex for us to understand.

If it were doable, we would already have been able to predict the stock market, predict weather patterns, and all these other things that, no matter how much more advanced we get scientifically and mathematically, we just can’t do, because there’s far more out there that’s beyond our understanding.

I just, I will take this moment to do a Jurassic Park plug, because I love doing it, but it just reminds me of Dr. Malcolm’s chaos theory. The unexplained, right? Like, we don’t know about the trickiest parts of the hiring process because they all intrinsically involve human decision-making and unknowns.

And if you go even deeper, unknown unknowns, to our boy Rumsfeld. Good old Rumsfeld, RIP. Oh, Rumsfeld. Yeah. But it makes sense sometimes. Nothing is going to change the fact that deciding on one’s job or career still remains one of the two, three, four, whatever the number is, one of the three core big decisions of your life.

So think: selecting a partner, and deciding with that partner whether or not you’re going to have or raise kids. And then secondly, where do you choose to live? What is your dwelling? Like, to make it very linear. Anyone who has gone through any of those decisions knows that rational thought, a lot of the time, flies right out the window.

So I’ll give you an example. We’ll go with offer time. People are going to make up their minds on whether they’re going to take an offer or not any way they want to. The best recruiters in the world have some learned intuition, but none of this can be broken down into an algorithm.

So you compare that to, like, Instagram, which is feeding you ads. Many of which are still irrelevant. A lot of it is stuff you already looked at and decided you’d already bought or weren’t going to buy. They’re doing this based on countless data points they’ve been able to collect on you, which they then fed into their AI algorithms.

But you literally will never have that data on every person who comes into your hiring funnel. You can’t. It’s not findable or searchable. It’s all in their own heads. And everyone out there is different. We’re all making decisions differently. Not to mention, you also don’t know what they’re not telling you.

Like whatever other interviews they may have, how they rank things. Again, good recruiters can get some of this detail, but you’re never going to know all of it. The only way to gain insights from candidates is to get them to trust you. You have to be real, credible, human, and actually establish a human bond and connection.

Like AI can’t, literally cannot do that. No. Like it’s one thing to algorithmically entice somebody to make like a $35 purchase. It’s another thing entirely to convince that same person to take a job, for instance. Or even less granularly to even consider talking about a job. Yeah. Okay. So just a quick list of things that we both agree

AI does great. Tons of useful utility: it creates templates for customization. You want a job description for a product manager? ChatGPT that thing, add some context to it about your culture, et cetera, done. You’ve got a job description. You can parse resumes better. You can combine lists for public and private usage.

You can, as we described before, pinpoint source with less structured data. But again, it’s still reliant on keyword searches. You can auto-update records. You can develop automations and use natural language processing. And there’s going to be a caveat here, you know, deeper into the rant.

But those are all useful use cases for AI. Yeah. The point is, AI is a positive, and there are countless ways it will save people time, but they’re all based on productivity, which is not the problem in hiring whatsoever. Freeing up recruiters to get more stuff done absolutely does not mean they’re going to do things better or the process is going to be fixed.

They’re just going to be taking more- 100%. They’re just going to take more laps around the same hamster wheel of broken interview and hiring processes. The effectiveness of data, I’m sorry, the effectiveness of AI, is entirely tied to data. So, data sets, lists of candidates for example, or metrics around successes, or, getting outside of hiring, employment data, or AI writing copy. The ability to combine and merge those data sets into something useful

faster than a human could. It’s stuff that’s quantifiable. There has to be a data, quantifiable element to it for any of this stuff to work. Exactly. The thing it misses, and I’ll get to the problem, is you go to market and it’s like, okay, well, the data suggests there are a thousand people relevant to this job description.

What it’s missing is that at any given moment, in that individual’s brain, those thousand individual brains, like 95 percent of them aren’t even looking for a job. Yeah. It’s a total straw man, which we’ll get to. The main thing here is not everything can be solved with data. Some things, like nuanced human relationship shit, like tribal stuff, like instinctual stuff,

there is no data available. It comes down to a quantitative versus qualitative issue. Now, the biggest criticism I always get every time I talk about AI and take a dump on it, even after admitting it has some very valid uses, is always the “well, we’re not there yet. We’re on that path.” Which is a complete straw man argument, kind of like you said. Look, we’re going to get there someday.

You just have to give it time. Jennifer Lawrence GIF. Yeah, sure. Okay. So it comes down to, okay, where’s the time and effort you’re going to put in? There’s a difference between creating new AI products and improving what’s perceived to be non-revenue-generating processes. So there’s a financial incentive to throw a lot of money and research and development at one, because it’s something you can sell, versus the other. Yep.

It’s why AI spend and development is always going to be allocated to revenue-generating activities in any organization. We firmly believe that definition should also include recruiting, i.e. recruiting is revenue generating, but history tells us that’s not the case. So barring some seismic shift in our collective office dork behavior, human capital is going to continue to be viewed as this endless, limitless resource, much like fossil fuel. See what I did there?

So one other thing we wanted to touch on, and it’s kind of relevant to that last point, is another paradox we’re interested in, called the automation paradox. You touched on it a little bit in some posts a couple of weeks ago. And I think we touched on it in our last rant.

This is the phenomenon where increasing levels of automation in a system can actually increase the need for human intervention rather than decrease it, which obviously is the complete opposite of what you want to do. Yeah. Think back to the industrial revolution. Machines were supposed to take away human jobs.

Like that was the big fear and everything at the time. But it turns out we needed people to build, fix, maintain, recalibrate, improve, and most importantly, build brand-new machines and think of new ones. Same thing happened when computers became a thing: when personal computers came along, all of a sudden there was a whole industry for software engineers and everything else. We needed more software and more hardware to create better and better stuff. Now, the thing is, when automations work,

I guess, like, it’s seamless. When automation’s working properly, you never know it was, you never really know what’s automated. But the thing is, when automations get broken, when stuff actually breaks down and it requires a human to fix it, it sticks out like a sore thumb.

Yeah. Just think of how many times you’re like, this process sucks. Like all of us know that. Those annoying drip campaigns, you know, stuff you’ll never buy, the LinkedIn connect requests, like “Noticed your profile and saw we have something in common,” or just straight up getting ghosted after an interview. All of that is relevant to this.

And it’s just, like, gross, right? AI involvement without a human who actually knows what they’re doing to course-correct things, in those exact instances, makes it worse. Yeah. It just gets further off the rails. You can’t think AI is going to fix everything if you’re also the person complaining about how much spam you’re seeing in your LinkedIn inbox.

And if you’re the recipient of those things, you foster these negative connotations toward the actual human who is supposedly sending them, which is, again, counterintuitive to the whole idea of it. Yeah. There’s one other thing I wanted to get into kind of on this.

I read a really interesting blog post by Jack Raines. If you haven’t, Google the guy, he’s great. Funny, snarky, makes very good points. That’s why I like him as a writer. Someone was asking him, does he use AI? Like someone was actually pitching to him: you should take all your blog posts and put them through AI and just have it write new stuff.

And he’s like, it’s a terrible idea for various reasons. He admitted that he does use it to clean things up, maybe find synonyms or awkward phrasing, find gaps, that kind of stuff. But what people mistake with AI, what they don’t realize about writing specifically, and I talk about this because right now generative AI is the biggest application.

So it’s very writing-centric. People who don’t understand what writing does and what’s involved just look at the output, the actual copy it creates. What they don’t realize is there’s another outcome of writing, and it’s what it actually does to you, in your brain.

The process of writing allows you to think through problems, identify gaps in your reasoning, and forces you to find other resources, maybe what you’re missing. When you write your thoughts out, it literally makes you smarter. It’s not an overstatement. It forces you to think through things better, and you become smarter.

And if you rob yourself of that, you’re just going to stay dumb. And if you swap out AI writing for recruitment tech: if you don’t force yourself as a human to learn how to fix a problem with your recruiting process and your hiring process, it’s always going to be bad, and you’re going to be bad at hiring and bad at recruiting.

Always. Best point you’ve made in years, James. I stole it from Jack, so. Yeah, alright. Takeaways, because we’re woefully long. AI is, to your point, just going to put bad automations on steroids. Anything that makes the candidate experience fall apart is going to be accentuated. That’s kind of the main takeaway. You have to have real-life, knowledgeable human beings on hand to fix things when they inevitably get broken.

And there’s always a life cycle of a process working and then breaking under scale. Like if you are a company that is excelling, that is a natural part of the growth process. So based on the reality of how we interview, we already don’t have enough human beings involved in those processes. Number two, as with everything technology related, AI should be viewed as an opportunity to decrease redundancy.

That is it. Does it make the menial tasks of your day easier? Of course it does. Unequivocally, yes. Invest in it. Great. Just don’t convince yourself it’s a silver bullet to automating your entire recruiting cycle. Yeah. Third takeaway, number three, people accept jobs because they like the people they’re going to work with.

Humans like humans. It’s vibes. Remember that your company, like that’s not going to change. Some AI gadget isn’t going to change the dynamic there. People are going to accept the job to work with you, with your company because they like you, they like everyone else they interviewed with. Like that’s it.

That’s the primary thing. And the last thing I’d say, point number four: humans have to solve hiring problems before we can even train AI to do it. Simple as that. It can’t solve a problem for us that we ourselves don’t already know how to solve. Yep. So don’t come at us if it doesn’t fix a redundancy, at least for the time being.

Yeah. We are short on clock. That’s a wrap for this week. Thanks for tuning in to The 10 Minute Talent Rant, part of the Talent Insights series, which is always available for replay on talentinsights.hirewell.com as well as YouTube, Apple Podcasts, Google Podcasts, Spotify, and Amazon. Jeff, thanks again as always. Everyone out there, we will see you soon.

