Uploading Resumes into Generative AI: A Practical Privacy Check for Hiring Teams

Data Privacy

Data Privacy Day 2026 is a useful moment to ask the question: do we genuinely know how candidate data is handled? Recruiters and Hiring Managers find themselves working in a new environment – one that involves easy-to-use Generative AI tools installed on our phones, computers and within our hiring systems.

Why is candidate data being uploaded into generative AI tools?

In practice, many Hirers are using generative AI to:

  • Summarise resumes and LinkedIn profiles
  • Rewrite candidate profiles for hiring managers
  • Compare shortlists using pasted CV content
  • Turn interview notes into feedback summaries
  • Draft candidate communications

Resumes contain personal data and often sensitive information. When uploaded into public or personal AI tools, organisations lose visibility over where that data goes, how long it is retained, and whether candidates have consented to that use.

Even when AI providers offer assurances, individual recruiters are not equipped to assess configuration settings, training use, or jurisdictional risk.

 

Public versus private GPT environments

Recruiters will find themselves working in one of two environments.

Public or Open-loop AI means your organisation and employees are using various publicly available tools – ones where data leaves the organisation’s controlled systems and cannot be fully monitored, governed, or retrieved. Actions sit outside the ATS and outside secure, enterprise-approved systems.

The Risk: From a governance perspective, this creates a parallel hiring process that most organisations do not audit or document, and a significant risk of operating in a way that is not compliant with your Privacy Act obligations.

Private or Closed-loop AI means your organisation has created its own enterprise system where data remains within the organisation’s controlled environment, with defined access, security controls, and governance. This allows employees to upload company reports, financials, pricing and strategies knowing that the right level of privacy and control exists.

The Risk: Over-confidence. The system may be restricted to that organisation, but not everyone in that organisation should have access to candidate data. It still requires governance and controls specific to candidate data use.

 

Marrin-Boyd Andrews, former Global Talent Sourcing Manager at Fonterra, was part of a team that implemented a closed-loop system.

“Putting AI into a closed-loop environment changed how it could be used in real hiring and workforce decisions. When all hiring data lived inside the enterprise, governed by the same controls as core HR systems, AI stopped feeling experimental and ran with clear expectations around trust, privacy, and reliability.

What made it work was ownership, not just technology. Leaders set the rules on what data could be used, how AI could shape decisions, and how outputs could be shared. The closed-loop model created a safe operating space, but governance made it credible.”

Nick Duggal, National Workplace Relations Leader at Moray & Agnew, says:

Candidate data is personal and potentially sensitive information that falls within privacy regulations. As candidates are not employees, the employee records exemption may not apply. Hirers should have a Privacy Policy that informs candidates how their personal information will be handled, including any third parties who are granted access to that information. Even if an AI system is closed-loop, it may still be enabling a third-party AI program provider to access that information. A recruiter removing a person’s name from personal data that is circulated will not necessarily eliminate all privacy regulation risks. For example, there may be other identifying information within the data.

In my view therefore hirers should outline in their Privacy Policies how AI may be utilised to aggregate candidate personal information, and avoid situations where personal data is circulated to third parties beyond the control of the candidate or Hirer. A failure to adhere to Privacy law requirements can lead to claims for both penalties and damages.  

 

A simple diagnostic for leaders

If you cannot confidently answer these questions, you likely have a visibility gap:

  • Do all stakeholders (e.g. recruiters, hiring managers and third-party vendors/agencies) who have access to candidate data know which AI tools are approved and which are not?
  • Have we explicitly said what candidate data can and cannot be uploaded into AI tools?
  • Would our recruiters and hiring managers give consistent answers if candidates asked how AI is used in hiring?
  • Do we provide a safe alternative, or are recruiters improvising?

 

Actionable checklist for HR and TA leaders

Open-loop AI systems (Public AI tools, personal GPT accounts, browser extensions)

In open-loop systems, the checklist is about preventing loss of control.

The primary risk is that candidate data leaves the organisation’s environment entirely. Leaders cannot see where the data goes, how long it is retained, or who else may access it. In this context:

  • “What data must never be uploaded” is a hard stop, not guidance
  • Silence may be interpreted as approval, so explicit rules are essential
  • Safe alternatives matter because productivity pressure drives behaviour
  • Third-party use is high risk and must be tightly constrained

Open-loop risk is about exposure and requires strong boundaries.

 

HR Technology with Generative AI Capability

An additional layer is that modern hiring technology often includes generative AI features, allowing you to compare resumes to position descriptions and create candidate summaries for hiring managers. However, whether people work within the system or outside it still represents a risk that requires understanding, guardrails and clear procedure. If your ATS does not include this capability, it is tempting for a recruiter to use external tools.

 

The takeaway

Data Privacy Day asks us to “take control of our data” and “prioritize privacy by design”. We are all in new operating environments, and it bears repeating that productivity gains only hold if they are matched by deliberate governance, clear boundaries, and leadership ownership of how candidate data is used, protected, and explained.

 

 
