Former Google, Facebook Execs Launch AI Human Resources Firm to ‘Remove Human Bias’


Two longtime Silicon Valley tech players have launched an artificial intelligence (AI) big data human resources (HR) company, intending to remove "human bias" from recruiting, hiring, and professional development. The company takes a "holistic" approach to employment, aiming to benefit both companies and employees.

"Employment is the backbone of society and it is a hard problem," explained Eightfold.ai CEO Ashutosh Garg, a former search and personalization expert at Google and IBM Research. "People pitch recruiting as a transaction [but] to build a holistic platform is to build a company that fundamentally solves this problem," making work the most meaningful for the most people.

Garg teamed up with chief technology officer Varun Kacholia, who led the ranking team at Google and YouTube search and the News Feed team at Facebook. Eightfold.ai, which has $24 million in funding provided by Lightspeed Ventures and Foundation Capital, already serves more than 100 customers using its tools across different industries, TechCrunch reported. It has processed more than 20 million applications, and increased response rates among its customers by 700 percent, while reducing screening costs and time by 90 percent.

Garg and Kacholia aim to remove the human biases from recruiting, hiring, professional development, and advancement by using big data and AI to construct the prototype of an ideal workforce. Eightfold.ai analyzes publicly available data collected from across the world that can be parsed and analyzed to create a Platonic ideal of any business in any industry.

The ancient Greek philosopher Plato suggested that the perfect working order of a government or human being required the right ordering of all working parts, and he structured an ideal government in the dialogue The Republic. Eightfold.ai operates on the idea that such an ideal workplace arrangement can be devised and personalized for specific businesses and industries.

With such an ideal crafted, Eightfold can apply it to a real company's workforce to optimize existing roles and devise hiring strategies.

"We have crawled the web for millions of profiles ... including data from Wikipedia," Garg told TechCrunch. "From there we have gotten data around how people have moved in organizations. We use all of this data to see who has performed well in an organization or not. Now ... we build models over this data to see who is capable of doing what."

Eightfold.ai also focuses on helping employees reach their full potential. After drafting a "talent graph" for the company, "we map how people have gone from one function to another in their career," Garg explained. "Every individual with the right capability and potential placed in the right role is meaningful progress for us."
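Eightfold.ai has not published its implementation, but a "talent graph" of the kind Garg describes can be illustrated as a directed graph of roles, weighted by how often people have moved from one role to another. A minimal sketch, using entirely hypothetical career histories and role names:

```python
from collections import Counter, defaultdict

# Hypothetical career histories: each is an ordered list of roles held.
careers = [
    ["data analyst", "data scientist", "ml engineer"],
    ["data analyst", "data scientist", "data science manager"],
    ["software engineer", "ml engineer", "ml engineer"],
    ["data analyst", "bi developer", "data scientist"],
]

# Build the "talent graph": counts of observed role-to-role transitions.
transitions = defaultdict(Counter)
for path in careers:
    for src, dst in zip(path, path[1:]):
        if src != dst:  # ignore staying in the same role
            transitions[src][dst] += 1

def likely_next_roles(role, k=2):
    """Most common next roles observed after `role` in the graph."""
    return [r for r, _ in transitions[role].most_common(k)]

print(likely_next_roles("data analyst"))  # -> ['data scientist', 'bi developer']
```

A real system would weight transitions by outcome and scale, but the core idea — mapping "how people have gone from one function to another" as graph edges — is the same.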

Echoing the holistic and vaguely spiritual push of the company, Garg and Kacholia took the name from Buddhism's "Eightfold Path" to Nirvana or "enlightenment." Their real faith seems more placed in technology, however.

“Many of the biases people have in recruiting stem from the limited data people have seen,” Garg argued. “With data intelligence we provide recruiters and hiring managers powerful insights around person-job fit that allows teams to go beyond the few skills or companies they might know of, dramatically increasing their pool of qualified candidates."

"Our diversity product further allows removal of any potential human bias via blind screening," the CEO said (emphasis added). "We are fully compliant with EEOC and do not use age, sex, race, religion, disability, etc. in assessing fit of candidates to roles in enterprises.”
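Blind screening of the kind Garg describes is commonly implemented by stripping protected attributes from a candidate record before any scoring happens. A minimal sketch — the field names and record shape are hypothetical, not Eightfold.ai's actual schema:

```python
# Fields removed before screening; most map to EEOC-protected categories,
# plus proxies like name and photo that can reveal them.
PROTECTED_FIELDS = {"age", "sex", "race", "religion", "disability", "name", "photo"}

def redact(candidate: dict) -> dict:
    """Return a copy of the candidate record without protected attributes."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "age": 42,
    "skills": ["python", "sql"],
    "years_experience": 12,
}
print(redact(candidate))  # {'skills': ['python', 'sql'], 'years_experience': 12}
```

Note that redaction alone is not a guarantee of fairness: remaining fields (schools attended, gaps in employment) can still correlate with protected attributes.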

Peter Nieh, a partner at Lightspeed Ventures, echoed this faith in technology. "Eightfold.ai has an incredible opportunity to help people reach their full potential in their careers while empowering the workforces of the future," Nieh said in a statement. “Ashutosh and Varun are bringing to talent management the transformative artificial intelligence and data science capability that they brought to Google, YouTube and Facebook."

How much, exactly, can big data and AI transform? While human bias can indeed prevent companies and individuals from making the right hiring decisions, machines often develop a bias of their own.

"Algorithmic bias is shaping up to be a major societal issue at a critical moment in the evolution of machine learning and AI," the Massachusetts Institute of Technology's Technology Review reported last June. "If the bias lurking inside the algorithms that make ever-more-important decisions goes unrecognized and unchecked, it could have serious negative consequences, especially for poorer communities and minorities."
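The mechanism is easy to demonstrate: a model fit to historical decisions inherits whatever patterns those decisions encode. A toy sketch with synthetic data (no real hiring records) in which two groups are equally qualified, but past decisions favored one:

```python
import random

random.seed(0)

# Synthetic history: groups "A" and "B" are equally likely to be qualified,
# but past (biased) decisions hired qualified B candidates less often.
history = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    qualified = random.random() < 0.5
    hired = qualified and (group == "A" or random.random() < 0.4)
    history.append((group, qualified, hired))

def hire_rate(group):
    """Observed hire rate among *qualified* candidates of a group --
    the statistic any model trained on this history would learn."""
    rows = [hired for g, q, hired in history if g == group and q]
    return sum(rows) / len(rows)

# Group A hires all qualified candidates (1.00); group B roughly 0.40.
print(f"A: {hire_rate('A'):.2f}  B: {hire_rate('B'):.2f}")
```

A model trained to predict "hired" from this data would reproduce the gap rather than correct it — which is why "more data" does not by itself remove bias.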

Algorithmic bias goes beyond the common liberal refrain of intersectionally "oppressed" groups like racial minorities, however. Conservatives have felt the brunt of YouTube's algorithmic bias, and Silicon Valley, the heart of America's technological advancement, is notoriously biased against conservatives.

All the same, Eightfold.ai seems primed to discover important truths about how the most successful firms are structured, and how employment decisions can avoid the pitfalls of failing businesses. Big data and AI can help companies like Eightfold.ai discover important truths about what best motivates successful employees, and how to connect workers with relevant experience to the right job, even if it means crossing different fields.

Garg and Kacholia just need to keep a humble perspective on the "transformative" power of technology. Machines enable humans to achieve a great deal, but at the end of the day all computer code is subject to human error, and bias does not disappear when translated into ones and zeroes.