AI is changing how we work and how we get hired... or don't. From résumé screening to automated interviews, artificial intelligence is increasingly shaping the recruitment process.
But beneath the surface of speed and efficiency, a more complicated question emerges:
What if the tools we’re using to reduce bias... are actually making it worse?
We find ourselves at an ethical crossroads, a place where convenience meets consequence. If you're a job seeker, it can feel like you're auditioning for an algorithm. For HR professionals, it raises hard questions about fairness, transparency, and responsibility.
Let's break it down: what's happening, why it matters, and what each of us can do about it.
At first glance, AI in recruitment might sound like a big win-win.
No more buried résumés, no more hiring based on gut feelings. Just smart tools, doing the hard work.
Here’s what’s typically happening:
CV scanners look for keywords or rank candidates based on past data
Video interview bots analyze your tone, language, and even facial expressions
Automated assessments score your personality, skills, or emotional intelligence
These tools promise to be objective, fast, and scalable. But behind the automation is a truth we can’t ignore: AI learns from data... and that data can often reflect the biases we all have as humans.
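To make that concrete, here is a toy Python sketch of the kind of keyword screening described above. The keywords, weights, and cut-off are invented purely for illustration; real tools are proprietary and far more complex, but the basic failure mode is the same: identical skills described in different words can lead to very different outcomes.

```python
# Illustrative only: a toy keyword-based CV screener.
# The keywords, weights, and cut-off are made up; real ATS tools are
# proprietary and combine many more signals.

REQUIRED_KEYWORDS = {"project management": 3, "stakeholder": 2, "agile": 2}
CUTOFF = 5  # candidates scoring below this never reach a human

def score_cv(cv_text: str) -> int:
    """Count weighted keyword hits in a CV."""
    text = cv_text.lower()
    return sum(weight for keyword, weight in REQUIRED_KEYWORDS.items() if keyword in text)

def shortlist(candidates: dict) -> list:
    """Return only the candidates whose keyword score clears the cut-off."""
    return [name for name, cv in candidates.items() if score_cv(cv) >= CUTOFF]

candidates = {
    "Candidate A": "Led agile project management for cross-functional stakeholder groups.",
    "Candidate B": "Coordinated complex programmes and delivered them ahead of schedule.",
}
print(shortlist(candidates))  # ['Candidate A'] -- B never reaches a human reviewer
```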
Take this in for a moment:
Over 60% of Australian organisations used AI to help with hiring last year
There are no national laws regulating how these systems are used
Many tools are still experimental, biased, or even scientifically discredited
And yet — they’re being used to reject candidates before a human ever sees them
This matters because these systems don't just sort applications; they can exclude people like you and me from opportunities altogether.
For example:
Amazon scrapped its AI hiring tool after it downgraded CVs with the word “women’s”
HireVue faced complaints in the U.S. after Deaf and neurodivergent applicants were scored unfairly on communication
One U.S. tutoring company programmed its tool to automatically reject women over 55
These are not just bugs; they're serious design flaws, and they carry real human costs.
Dr. Natalie Sheard, a postdoctoral fellow at the University of Melbourne, has spent years studying this.
Her research revealed something stark:
“AI hiring tools may enable, reinforce, and amplify discrimination — especially against already disadvantaged groups.”
These tools can:
Penalise non-standard English (hurting those from migrant backgrounds)
Score you lower on “positivity” if you're depressed or neurodivergent
Prefer traditionally “masculine” language like “executed” or “actioned”
Misread facial expressions — particularly across racial and cultural lines
And because the systems are often proprietary and opaque, it's hard to know when this is happening... or to challenge it when it does. That's the real risk: you might be filtered out and never even know why.
If you’re job hunting and feel like the process has become weirdly robotic, I feel you. You might spend hours tailoring a CV, only to get ghosted, face automated questions that feel designed to trip you up, get rejected without any clear feedback — or any human contact.
It’s frustrating. And worse, it can feel personal.
But here’s the key insight: It’s not about you. It’s about the system. You may be a great fit, but if the AI screening tool isn’t trained to recognize that, you never get a chance.
So what can you do?
🧠 Work with the system
Optimise your résumé and cover letter for AI scanners. Use keywords from the job description. Make it clean, clear, and structured.
Get my free cover letter template and you'll receive a special offer for my AI- and ATS-optimised résumé template!
🎤 Prepare for “robo interviews”
Practice answering questions on camera. Speak clearly, use examples, and stay concise. Smile, even if it feels awkward at first. Yes, it’s performative and weird af, but it helps.
👥 Seek human contact
Where possible, try to network directly with hiring managers. A referral or warm intro can get your application out of the algorithm’s black hole.
📣 Know your rights
Laws are evolving, but you can ask about how your data is used. And if you suspect bias, it’s worth documenting your experience.
If you work in hiring, you’ve likely felt the pressure to automate — especially when flooded with hundreds of applications. AI tools can save time, reduce admin, and promise consistency.
But let's also reflect: at what cost?
When we delegate decision-making to a machine, we risk:
Missing out on great candidates who don’t fit the algorithm
Amplifying past hiring biases baked into the system’s training data
Facing reputational or even legal consequences if discrimination occurs
Use AI as a tool, not a decision-maker. It can help screen, but a human should always have the final say.
Audit your systems regularly. Look for patterns in who's being rejected, and why; a simple example of this kind of check is sketched below.
Demand transparency. If you don’t understand how a tool works, ask. Push vendors for explainability.
Be proactive about inclusion. That means training your team, updating your policies, and staying open to feedback from applicants.
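To make the audit advice concrete, one common starting point is comparing selection rates across groups, along the lines of the "four-fifths rule" used in US adverse-impact analysis. Here is a minimal, hypothetical Python sketch; the groups, numbers, and 0.8 threshold are purely illustrative, and a real audit needs proper statistics, context, and legal advice.

```python
# Hypothetical audit sketch: compare how often each group passes the AI screen.
# All figures below are invented for illustration.

screening_outcomes = {
    # group: (number screened in by the tool, number who applied)
    "group_a": (120, 400),
    "group_b": (45, 300),
}

rates = {g: passed / applied for g, (passed, applied) in screening_outcomes.items()}
best_rate = max(rates.values())

for group, rate in rates.items():
    impact_ratio = rate / best_rate
    # The "four-fifths rule" treats ratios below 0.8 as a possible sign of adverse impact.
    flag = "REVIEW" if impact_ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} -> {flag}")
```

Numbers like these won't tell you why a group is being screened out, but they will tell you where to start asking questions.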
AI in hiring isn’t going away. But we’re still figuring out how to use it responsibly and ethically.
The tech is powerful, and I'm genuinely excited about what emerging technologies are opening up for us on every level. But there are also risks, and we need to be aware of them.
I'd say the real challenge isn't just technical; it's human. It's about the values we stand for. Let's ask ourselves whether we're creating systems that work for everyone, or just for some.
Until the law catches up, we need awareness and vigilance from employers, from job seekers, from policymakers, and from the people building these tools.
The goal isn’t to reject technology. It’s to make sure it serves us, all of us, fairly.
AI in recruitment is here to stay, but how it's used is still up for debate 🥲
🔹 Have you experienced AI influence in the hiring process, as a job seeker or HR professional?
🔹 Do you think these tools are helpful, harmful, or something in between?
🔹 What needs to change to make the process more fair and transparent?
Drop a comment below and share your experience; I'd really love to hear what you think!