Morning interviews may yield higher scores due to interviewer bias, research shows.
Bias in hiring can be influenced by the time of day, affecting candidate evaluations.
AI tools could reduce this, offering fairer assessments than manual methods.
If you get to choose when to schedule a job interview, you might want to grab a coffee and go for a morning slot.
That's because some people conducting interviews tend to give higher scores to candidates they meet with earlier in the day compared with the afternoon, a startup's review of thousands of interviews found.
It's not an absolute, of course, and candidates can still kill it well after lunchtime. Yet, in a job market where employers in fields like tech have been slow to hire, even a modest advantage could make a difference, Shiran Danoch, an organizational psychologist, told Business Insider.
"Specific interviewers have a consistent tendency to be harsher or more lenient in their scores depending on the time of day," she said.
It's possible that in the morning, interviewers haven't yet been beaten down by back-to-back meetings, or are perhaps still enjoying their own first coffee, she said.
Danoch and her team noticed the morning-afternoon discrepancy while reviewing datasets on thousands of job interviews. Danoch is the CEO and founder of Informed Decisions, an artificial intelligence startup focused on helping organizations reduce bias and improve their interviewing processes.
She said the inferences on the time-of-day bias are drawn from the datasets of interviewers who use Informed Decisions tools to score candidates. The data reflected those who've done at least 20 interviews using the company's system. Danoch said that in her company's review of candidates' scores, those interviewed in the morning often received higher marks, and the difference was statistically significant.
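To illustrate what "statistically significant" means here: one common way to check whether a morning-afternoon gap could be due to chance is a permutation test. The scores below are hypothetical and the approach is a minimal sketch, not Informed Decisions' actual data or methodology.

```python
import random
import statistics

# Hypothetical interview scores on a 1-5 scale (not real data).
morning = [4.2, 4.5, 3.9, 4.4, 4.1, 4.6, 4.0, 4.3]
afternoon = [3.8, 4.0, 3.6, 3.9, 4.1, 3.7, 3.5, 3.9]

# The observed gap between average morning and afternoon scores.
observed = statistics.mean(morning) - statistics.mean(afternoon)

# Permutation test: repeatedly shuffle the pooled scores into two
# arbitrary groups and count how often a gap at least as large as the
# observed one arises purely by chance.
pooled = morning + afternoon
n = len(morning)
random.seed(0)
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if diff >= observed:
        count += 1

p_value = count / trials
print(f"observed gap: {observed:.2f}, p = {p_value:.4f}")
```

A small p-value means a gap of that size would rarely appear if time of day had no effect, which is the sense in which such a discrepancy is called statistically significant.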
The good news, she said, is that when interviewers are made aware that they might be more harsh in the afternoon, they often take steps to counteract that tendency.
"In many cases, happily, we're actually seeing that the feedback that we're providing helps to reduce the bias and eventually eliminate the bias," Danoch said.
However, she said, interviewers often don't get feedback about their hiring practices, even though finding the right talent is "such a crucial part" of what hiring managers and recruiters do.
She said other researchers have identified how the time of day, and whether someone might be a morning person or an evening person, can affect decision-making processes.
An examination of more than 1,000 parole decisions in Israel found that judges were likelier to show leniency at the start of the day and after breaks. However, that favorability decreased as judges made more decisions, according to the 2011 research.
Tech could help
It's possible that if tools like artificial intelligence take on more responsibility for hiring, job seekers won't have to worry about the time of day they interview.
For all of the concerns about biases in AI, more "manual" hiring, in which interviewers ask open-ended questions, often introduces more bias than AI does, said Kiki Leutner, cofounder of SeeTalent.ai, a startup creating AI-run tests that simulate tasks associated with a job. She has researched the ethics of AI and of assessments more broadly.
Leutner told BI that it's likely that in a video interview conducted by AI, for example, a candidate might have a fairer shot at landing a job.
"You don't just have people do unstructured interviews, ask whatever questions, make whatever decisions," she said.
And, because everything is recorded, Leutner said, there is documentation of what decisions were made and on what basis. Ultimately, she said, it's then possible to take that information and correct algorithms.
"Any structured process is better in recruitment than not structuring it," Leutner said.
Humans are 'hopelessly biased'
Eric Mosley, cofounder and CEO of Workhuman, which makes tools for recognizing employee achievements, told BI that data created by humans will be biased, because humans are "hopelessly biased."
He pointed to 2016 research indicating that juvenile court judges in Louisiana doled out tougher punishments, particularly to Black youths, after the Louisiana State University football team suffered a surprise defeat.
Mosley said, however, that AI can be trained to ignore certain biases and look for others to eliminate them.
Taking that approach can help humans guard against some of their natural tendencies. To get it right, however, it's important to have safeguards around the use of AI, he said. These might include ethics teams with representatives from legal departments and HR to focus on issues of data hygiene and algorithm hygiene.
Not taking those precautions and solely relying on AI can even risk scaling humans' biases, Mosley said.
"If you basically just unleash it in a very simplistic way, it'll just replicate them. But if you go in knowing that these biases exist, then you can get through it," he said.
Danoch, from Informed Decisions, said that if people conducting interviews suspect they might be less forgiving after the morning has passed, they can take steps to counteract that.
"Before you interview in the afternoons, take a little bit longer to prepare, have a cup of coffee, refresh yourself," she said.
Elon Musk and Vivek Ramaswamy have plans for DOGE, and Workday sees an opportunity.
Workday aims to capitalize on federal agencies' shift from on-premises to cloud systems, its CEO said.
The federal government, the largest US employer, could face layoffs under DOGE's agenda.
As Elon Musk and Vivek Ramaswamy gear up to try to reshape large swaths of the federal government, one big software player sees an opportunity.
Workday, the human-resources-software company that workers love to hate, is embedded in more than half of Fortune 500 companies. The $72 billion company has been building up its government customer base, from Oklahoma's Tulsa County to the US Department of Energy. In 2022, Workday was approved to work with the federal government.
Now that Musk and Ramaswamy's Department of Government Efficiency is set to advise President-elect Donald Trump on rescinding regulations and cutting administrative costs, Workday and other government vendors could stand to benefit.
On Workday's Tuesday earnings call, CEO Carl Eschenbach addressed an analyst's question about how DOGE could impact Workday's business.
Eschenbach said more than 80% of the federal government's HR systems were physically housed on local servers, what's called "on-premises." Companies and organizations have been steadily migrating from on-premises servers to the cloud for cost savings, better security, and efficiency, among other benefits.
"Postelection and with DOGE coming out, people are absolutely looking to drive more economies of scale and more efficiency. And I can tell you supporting these on-premises, antiquated systems is not a way to do that," Eschenbach said.
Eschenbach added that federal agencies were at an "inflection point" and ready to move to the cloud, and Workday has a government-focused product to sell them.
"We think this will only be a tailwind for us as we think about the federal government business going forward," he said.
Workday said in May that it would work with the Department of Energy and the Defense Intelligence Agency.
"These are critical wins for us and it's actually driving demand for us in the federal government as people recognize Workday is really pushing hard into that market," Eschenbach said on Tuesday's call.
In the last quarter, Workday brought in $2.2 billion in revenue, a 16% increase from last year. The company doesn't break out revenue by customer type. Workday's stock is up 14% in the past year.
The company didn't respond to a request for comment sent outside business hours.
Last week, Musk and Ramaswamy named several of DOGE's targets in a Wall Street Journal opinion column: work-from-home arrangements, Planned Parenthood, the Corporation for Public Broadcasting, and general head count, among others.
"DOGE intends to work with embedded appointees in agencies to identify the minimum number of employees required at an agency for it to perform its constitutionally permissible and statutorily mandated functions," the pair wrote.
The federal government is the largest employer in the US, with a workforce of more than 2 million Americans, so the group's suggestions could have wide-ranging implications.
The Washington Post reported on Sunday that notable Silicon Valley figures (including the Palantir cofounder Joe Lonsdale, the investor Marc Andreessen, the hedge-fund manager Bill Ackman, and the former Uber CEO turned food-tech entrepreneur Travis Kalanick) had been involved in DOGE's early planning.
This article is part of "Workforce Innovation," a series exploring the forces shaping enterprise transformation.
Diversity, equity, and inclusion programs have become the subject of a heated, politicized debate over the past few years.
Several major corporations, including John Deere, Microsoft, and Molson Coors, have made headlines recently for rolling back their DEI initiatives.
Meanwhile, Walmart, the world's largest retailer, announced it would no longer use the acronym in its communications and would not extend its Center for Racial Equity, a nonprofit established in 2020 with a five-year, $100 million commitment to address racial disparities.
Even so, as we've reported in this series, many companies remain committed to the values of DEI but are shifting their strategies for a new era. Whatever the motivation of the companies, it's clear that DEI is undergoing a period of change.
Business Insider asked its Workforce Innovation board to participate in a roundtable to discuss how DEI programs are evolving. We wanted to find out what structural changes are happening, how companies can continue to build trust with employees, and what role artificial intelligence is poised to play.
The consensus around the virtual table was that the focus of the DEI story is shifting to business outcomes and the skills needed to achieve them. "We can't do it the old way," Purvi Tailor, the vice president of human resources at Ferring Pharmaceuticals, said. "We have to have the conversation in a new way. It becomes much more about inclusion and changing mindsets and creating awareness about your own biases."
Skills-based hiring is one way companies are working to identify diverse candidates organically. "Let's focus on the skills that are required for the future of work and what we are looking for from leaders in our company," Maggie Hulce, the chief revenue officer at Indeed, said. "And then be more consistent in the application of holding that bar."
By homing in on the skills organizations need to succeed and how to use AI tools to help surface in-house talent, companies could move the DEI story away from conflicts and focus on its benefits.
"It dismisses this notion that you have to lower the bar if you want diversity in your organization," said Spring Lacy, the global head of talent acquisition and DEI at Verizon. "We've got lots of super smart, super skilled people of color, women, people with disabilities, LGBTQI community, who just aren't seen for all of the biases that you talked about. You don't have to lower the bar."
Roundtable participants included:
Anant Adya, executive vice president, service offering head, and head of Americas Delivery, Infosys
Lucrecia Borgonovo, chief talent and organizational effectiveness officer, Mastercard
Chris Deri, president, Weber Shandwick Collective
Maggie Hulce, chief revenue officer, Indeed
Spring Lacy, vice president, chief talent acquisition and diversity officer, Verizon
Purvi Tailor, vice president of human resources, Ferring Pharmaceuticals USA
Here are six key takeaways from the discussion.
Skills-based hiring, supercharged with AI tools, helps companies find 'hidden figures'
Skills-based hiring is a strategy that some companies are using to identify candidates and reduce bias in the hiring process. The approach focuses on the skills needed to fulfill the role, de-emphasizing credentials like college degrees or previous job titles.
With artificial intelligence, talent leaders can accelerate the hiring process and uncover strong candidates within their companies that they might have missed before.
Lacy, who was previously an HR leader at Prudential, said AI is empowering existing employees to showcase their abilities more effectively.
"When went to recruit internally, and we pulled people based on the skills profile and not based on proximity bias or any other bias, our slates were inherently more diverse," Lacy said.
The critical piece for companies is to figure out the best way to capture an accurate and comprehensive view of employees' skills.
Verizon uses the Workday HR platform and is piloting a program with its partner company, Censia, that uses an AI tool to help employees craft their profiles.
Lacy has seen how difficult it can be for employees to isolate their skills in ways that might help them be identified for new opportunities. "When we said to employees, 'Go build a skills profile,' the page was blank," she said. "It was really hard for people to get started." AI tools can pull information from a range of sources and serve up a framework that guides employees through the process.
Mastercard has launched an employee-skills initiative with the software company Gloat. "It has been a really great way to democratize access to opportunities for employees," said Lucrecia Borgonovo, Mastercard's chief talent and organizational effectiveness officer.
The outcome for companies can be a more diverse talent pool from inside the house.
Lacy said Verizon is conscious of the potential for bias in the AI programs, but early indicators suggest that more individuals are being considered for roles than in the past.
"We are uncovering hidden figures in this organization because there are people who we don't know, because they are not well networked, they don't have sponsors," Lacy said. "If not for this technology, we wouldn't have known that they were there, to be able to lift them and perhaps provide them with other opportunities."
Leaning into the 'I' of DEI β inclusion
DEI programs have many aspects, including employer branding and attracting a diverse talent pool, screening and hiring, and compensation.
Inclusion relates to a person's workplace experience and their sense of belonging at an organization, which research suggests makes people want to join and stay at a company. Benefits are an essential part of that employee experience, and companies may want to think about how these packages reflect their values to staffers and prospects alike.
Ferring Pharmaceuticals introduced a program in 2022 that includes unlimited financial support for creating a family (through IVF, adoption, surrogacy, or birth) for all employees, regardless of gender or sexual orientation.
Ferring's Tailor said it is one way that the company emphasizes its approach to its entire workforce.
"We talk about more of the 'I' than we do about the 'D' and the 'E,'" Tailor said. "We do it to show the kind of culture and working environment that we want to have. It's all about inclusion and bringing your whole self to the workplace."
Linking AI tools with culture and leadership
As companies develop new hiring strategies, culture does not stand still.
"Inclusion and belonging are essential parts of the culture, the value proposition, and key to driving the outcomes of our business," said Mastercard's Borgonovo."It's really important that we drive shared accountability across our 34,000 employees around the role that each of us has to collectively play in creating this culture of inclusion where everybody feels that they can belong."
Borgonovo said that Mastercard is exploring ways to leverage AI to help business leaders across the organization improve efficiency and be more intentional about DEI and other workforce goals.
"How do we enable people, leaders, from an automation or efficiency standpoint? How do we help them be more proactive?" she said. "How do we help them create more bandwidth by automating certain processes so then they have more time to coach and develop their teams."
She said the company is exploring how AI can be used to coach leaders to role-play and get feedback on how they engage with their teams. "AI can be your coach, your copilot, and help augment your leadership," she said.
Ditching the DEI silo
Indeed's Hulce said a lot of time goes into optimizing the company's structure. "How do you make it the norm that equity needs to be built into processes, period," she said.
It's not just about interviewing and hiring diverse candidates, but about leading teams through every opportunity and decision, including promotions, performance bonuses, and assignments.
"How do you measure that? How do have regular conversations with managers at different levels in the organization about the expectation that we will be looking at equity in all of these steps," Hulce said.
Indeed once had a DEI team that worked parallel to the HR function. But when the previous HR leader left the company, they decided to reorganize and embed the DEI discipline across the business, elevating the previous head of DEI to chief people officer.
Hulce said realigning DEI was essential to scaling goals, standards, and accountability across the company. "It's almost an impossible task to ask a separate group to influence everybody else unless it's built into core processes somehow," she said.
Infosys is also considering its optimal DEI structure. "We are slightly decentralized," Anant Adya, an executive vice president, said. The global company has a centralized corporate DEI team, with DEI councils at the individual industry units. Adya said the company will leverage AI tools to help measure effectiveness.
Hulce emphasized the need to regularly and consistently review management decisions. "It can't be just once a year," she said. "You evaluate, you check, and if there's a correction to be made, you say, 'OK, guys, something looks amiss.' The expectation is we will be following equitable processes."
Using AI to scrutinize hiring, while retaining the human touch
Adya said Infosys is using AI to analyze patterns in its hiring data.
"It is very important to look at and analyze the data based on how hiring patterns are being used and if there is any bias in the hiring process itself," he said.
AI will grow increasingly important in analyzing the efficacy of various recruitment sources. "A lot of times we see that employee referrals actually work the best," he said. "But that might not be true when it comes to specific DEI initiatives."
By enlisting AI tools to analyze online sources, university partnerships, and other talent alliances and platforms the company is using, Adya said it should be able to optimize its approach around specific goals.
But all the AI analysis in the world does not negate the need for the human touch. Adya said that sometimes there's a perception at the company that hiring is being done only to hit certain DEI benchmarks and that the process is too onerous.
Adya said that hosting a "clear dialogue" about the company's decision-making process around recruitment methodology has helped employees understand the company's rationale.
"It's always better to sit down and explain why this is critical for the unit and why it is important," he said. "Sometimes open dialogues, going back to the old school, not using AI or gen AI, but just sitting and talking and removing that uncertainty and lack of transparency helps a lot."
Leveraging AI-powered insights to change the DEI story
Proponents of DEI maintain that a diverse, inclusive workplace yields better business results, and there are studies that also support that view.
Opponents of DEI, said Chris Deri, the president of Weber Shandwick's corporate advisory business, tend to focus on the methodology of achieving workplace diversity, such as companies actively seeking women for leadership positions, seemingly at the expense of male candidates.
"That's what DEI opponents are focused on," Deri said. "Like, how do you pull together a candidate pool, like having women candidates somehow be seen to be at the front of the line."
Deri said that companies should work to shift the perspective to DEI outcomes and tangible business benefits, and should leverage artificial intelligence to surface insights that might not be obvious.
"AI can do that in a way that human knowledge management and analysis is not going to be able to do," Deri said. "We can use the power of AI to look across our enterprises' data and knowledge and start to collect the outputs and outcomes of the principles of applying DEI. "
Deri said that if a large language model can be trained on the outcomes, such as attracting new customers, creating new products, and building community trust, "that might be something that uses technology to help the storytelling about DEI. We really need to change the entire story now."