At ATC2025, the session “Canva Redesigned Tech Interviews for the AI Era” was a standout example of how hiring practices are catching up with the realities of modern work. I found this session both refreshing and thought-provoking. Canva’s approach challenges the old model of technical interviews by recognising that coding in 2025 isn’t a solo act; it’s a collaboration between humans and AI. Rather than testing memory or syntax, their process evaluates how candidates think, prompt, and problem-solve alongside intelligent tools. It felt like a glimpse into the future of hiring.
Takeaways
The session highlighted Canva’s innovative approach to integrating AI into technical interviews. The most impactful insight was the emphasis on transparency – encouraging candidates to demonstrate how they prompt AI tools to generate code, review outputs, and fix bugs. Candidates are informed before the interview that they will be expected to use AI tools during the process. They are encouraged to share their preferred AI tools with the Canva team, and Canva provides access to those tools during the interview so candidates can showcase their coding capabilities. This approach allows hiring managers to assess not just coding ability but also prompt engineering skills, which are becoming critical in an AI-driven world. Another key takeaway was Canva’s AIP (AI-Assisted Programming) initiative, which focuses on leveraging AI to produce robust solutions and uplift coding capabilities. The future of coding is collaborative: developers will partner with AI to generate clean, efficient code while maintaining accountability. AI-driven interviews enable the assessment of critical skills while ensuring a structured, high-quality process that aligns internally across stakeholders.
How It Made Me Feel or Think
The session reinforced the idea that technical interviews are evolving beyond traditional coding tests. It made me think about how AI is not replacing developers but augmenting their capabilities. The concept of assessing a candidate’s ability to work with AI tools feels forward-thinking and practical. It also sparked thoughts about how organisations need to adapt their evaluation frameworks to measure skills like prompt engineering and AI collaboration.
Questions I Have Walked Away With
How do we standardize the evaluation of prompt engineering skills across different roles and levels?
What safeguards should be in place to ensure fairness when candidates use AI tools during interviews?
How do we balance AI assistance with assessing a candidate’s core problem-solving ability?
What training or enablement do hiring managers need to effectively evaluate AI-driven coding approaches?

A message from the ATC Team:
ATC2025 gave us so many moments worth remembering – insightful sessions, honest conversations, and that buzz of the Talent community coming together.
As we look ahead, planning for ATC2026 starts now.
Download the new ATC2026 Budget Planner to help make the case, secure funding, and bring your team along next year.
13–14 October 2026 | Fed Square, Melbourne
