Restless Bandit Blog


Hand over your iPhone, and put on this tinfoil hat

Posted by Steve Goodman on Jun 8, 2017 7:48:20 AM

In his article "In 10 Years, Job Hunting Will Be Obsolete," Geoffrey James extrapolates from my quotes to raise serious concerns about privacy and transparency that should trouble us all. But he misses the point as it pertains to AI and big data in hiring: the type of data used, and whether or not it's appropriate, isn't inherent to the technology.

All algorithms operate and learn within a defined set of parameters and rules, just like every company's recruiters, accountants, and marketers. They don’t have their own agenda; they’re a tool, and they work with what they're given. The Talent Rediscovery algorithms we developed work from publicly available profile data—essentially anything that can be found in a Google search—and resumes that have been submitted to a company by job applicants.
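To make that concrete, here is a minimal, purely illustrative sketch in Python of what "working with what they're given" means. The source labels, field names, and logic are assumptions for the sake of example, not Restless Bandit's actual code: the point is simply that a screening model can only ever consider data from the sources it has been explicitly handed.

```python
# Illustrative sketch only; the sources, field names, and logic are assumptions
# for the sake of example, not Restless Bandit's actual code.

ALLOWED_SOURCES = {"public_profile", "submitted_resume"}

def build_candidate_features(records):
    """Keep only data from permitted sources before any scoring happens."""
    features = []
    for record in records:
        if record.get("source") not in ALLOWED_SOURCES:
            continue  # browser history, private messages, etc. never reach the model
        features.append({
            "skills": record.get("skills", []),
            "years_experience": record.get("years_experience", 0),
        })
    return features
```

Anything outside the permitted sources never becomes an input, so it can't influence the outcome.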

There's no question that our privacy is at risk now more than ever. We've seen that in recent large-scale hacks of global systems, and privacy protections are being rolled back at the hands of the current administration. Privacy is a critical issue, for sure, but one that is separate from the use of algorithms and big data in recruiting. Simply put, the technology isn't responsible for privacy violations.

As for a "secret algorithm applying secret rules" (paranoid much?), of course that could happen, if a human or a company built it that way. But even if such a sinister algorithm did exist, it's no different from an unscrupulous human gaining unauthorized access to prejudicial information to deny you a mortgage, or a job, or a credit card. And if an employer actually runs an algorithm that determines you're likely to steal, and fires you based solely on that, you should be glad to get the hell out of there.

There are a lot of "whys" we never know the answer to in life, whether or not technology is involved. We've all been rejected for a job, passed over for a promotion, or left unpicked for a team on the grade-school playground, all by humans. Usually we don't even know why. Algorithms may not be completely transparent, but neither is human logic, opinion, or bias.

The Fourth Amendment, as Mr. James rightly points out, needs to be vigorously defended for all of our sakes. Most of us would agree that things like browser history should never be accessed by any employer, or by the government, without our express consent or a proper search warrant. That's a legislative issue, and it goes far beyond the technologies used in recruiting. Anything you post publicly, voluntarily, and with no expectation of privacy, like a LinkedIn profile or a tweet, is fair game. But no matter the information, we need to be vigilant and active in defending our privacy, and blaming technology is a red herring and a dangerous distraction.

We live in a time where data is captured in vastly greater volume and in exponentially more ways than ever before. Data is knowledge. Knowledge is power. And anything powerful, as we know, carries risks proportionate to its considerable benefits.

But if we had chosen to live in fear instead of embracing innovations that can improve our lives, we would all have flushed our iPhones, canceled our internet service, and gone off the grid a long time ago.

Innovation is a calculated risk. And always has been.

It’s also the way forward. And always will be.

Topics: AI, privacy, artificial intelligence, machine learning, recruiting, talent acquisition, HR Tech
