Recently, I’ve been interviewing Computer Science students applying for data science and engineering internships with a 4-day turnaround from CV vetting to final decisions. With a small local office of 10 and no in-house HR, hiring managers handle the entire process.

This article reflects on the lessons learned across CV reviews, technical interviews, and post-interview feedback. My goal is to help interviewers and interviewees make this process more meaningful, kind, and productive.

Principles That Guide the Process

The Process Overview

  1. Interview Brief
  2. CV Vetting
  3. 1-Hour Interview
  4. Post-Interview Feedback

A single, well-designed hour can be enough to judge potential and create a positive experience, provided it’s structured around real-world scenarios and mutual respect.

The effectiveness of these tips will depend on company size, the rigidity of existing processes, and the interviewer’s personality and leadership skills.

Let’s examine each component in more detail to understand how they contribute to a more empathetic and effective interview process.


Interview Brief: Set the Tone Early

Link to sanitized version. 

The brief provides:

Brief Snippet: Technical Problem Solving

Exercise 1: Code Review (10-15 min)

Given sample code, comment on its performance characteristics using Python and computer science concepts.

What signals this exercise provides

  • Familiarity with IDE, filesystem and basic I/O
  • Sense of high performance, scalable code
  • Ability to read and understand code
  • Ability to communicate and explain code
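The brief doesn’t reproduce the actual sample code, but a hypothetical snippet in the spirit of Exercise 1 might look like this: the candidate is expected to spot the quadratic membership test and propose a linear alternative.

```python
# Hypothetical Exercise 1 sample (not the actual brief's code):
# comment on the performance characteristics of dedupe, then improve it.

def dedupe(items):
    # O(n^2): membership test on a list scans linearly on every iteration
    seen = []
    out = []
    for x in items:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedupe_fast(items):
    # O(n): dict keys give O(1) average membership checks
    # and preserve first-seen order
    return list(dict.fromkeys(items))

print(dedupe([3, 1, 3, 2, 1]))       # [3, 1, 2]
print(dedupe_fast([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

A strong answer names the asymptotic difference and explains why the outputs stay identical.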

No one likes turning up to a meeting without an agenda, so why offer candidates any less context than we expect from teammates?

Process Design

Well-designed questions should leave plenty of room for expanding the depth of the discussion. Interviewers can show empathy by providing clear guidance on expectations. For instance, sharing exercise-specific evaluation criteria (which I refer to as “Signals” in the brief) allows candidates to explore beyond the basics.

Code or no code

Whether I include pre-written code or expect the candidate to write it depends on the time available. I typically reveal it at the start of each task to save time, especially since LLMs can often generate the code, as long as the candidate demonstrates the right thinking.

CV Vetting: Signal vs Noise

You can’t verify every claim on a CV, but you can look for strong signals.

Git Introspection

One trick is to run git log --oneline --graph --author=gitgithan --date=short --pretty=format:"%h %ad %s" to see all the commits authored by a particular contributor.

You can see what type of work it is (feature, refactoring, testing, documentation), and how clear the commit messages are.
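As a sketch of that idea, the log output can be tallied by work type. The sample lines and the conventional-commit prefixes below are illustrative; in practice you would feed in real `git log --oneline` output (e.g. via `subprocess.run`).

```python
import re

# Stand-in for real `git log --oneline --author=...` output
SAMPLE_LOG = """\
a1b2c3d feat: add CSV ingestion pipeline
d4e5f6a refactor: extract file reader into module
b7c8d9e test: cover empty-file edge case
e0f1a2b docs: explain generator usage
c3d4e5f feat: support gzip input
"""

def tally_commit_types(log_text):
    # Count conventional-commit prefixes ("feat:", "test:", ...)
    # that follow the short hash on each line
    counts = {}
    for line in log_text.splitlines():
        m = re.match(r"\S+\s+(\w+):", line)
        kind = m.group(1) if m else "other"
        counts[kind] = counts.get(kind, 0) + 1
    return counts

print(tally_commit_types(SAMPLE_LOG))
# {'feat': 2, 'refactor': 1, 'test': 1, 'docs': 1}
```

A skew toward one type isn’t inherently bad, but it shapes the questions worth asking in the interview.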

Strong signals 

Weak or Misleading signals


Interview: Uncovering Mindsets

Reflecting on the Interview Brief

I begin by asking for thoughts on the Interview Brief.

This has a few benefits:

Favorite Behavioral Question

To uncover essential qualities beyond technical skills, I find the following behavioral question particularly revealing

Can you describe a time when you saw something that wasn’t working well and advocated for an improvement?

This question reveals a range of desirable traits:

Effective Interviewee Behaviours (Behavioural Section)

  1. Attuned to personal behavior, both its effect on others and how it is affected by them
  2. Demonstrates the ability to overcome motivation challenges and inspire others
  3. Provides concise, inverted pyramid answers that uniquely connect to personal values

Ineffective Interviewee Behaviours (Behavioural Section)

  1. Offers lengthy preambles about general situations before sharing personal insights

Tips for Interviewers (Behavioural Section)

I’ve never been a fan of questions focused on interpersonal conflicts, as many people tend to avoid confrontation by becoming passive (e.g., not responding or mentally disengaging) rather than confronting the issue directly. These questions also often disadvantage candidates with less formal work experience.

A helpful approach is to jog their memory by referencing group experiences listed on their CV and suggesting potential scenarios that could be useful for discussion.

Providing instant feedback after their answers is also valuable, allowing candidates to note which stories are worth refining for future interviews.

Technical Problem Solving: Show Thinking, Not Just Results

Measure Potential, Not Just Preparedness

Effective Interviewee Behaviours (Technical Section)

  1. Exercise 1 on File reading with generators: admitting upfront their unfamiliarity with yield syntax invites the interviewer to hint that it’s not important
  2. Exercise 2 on data cleaning after JOIN: caring about data lineage, constraints of the domain (units, collection instrument) shows systems thinking and a drive to fix the root cause
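For context on the first point, a minimal sketch of the generator-based file reading Exercise 1 gestures at (hypothetical, not the actual exercise code):

```python
import io

def read_lines(f):
    # yield makes this a generator: lines are produced one at a time,
    # so memory use stays flat regardless of file size
    for line in f:
        yield line.rstrip("\n")

# io.StringIO stands in for an open file handle
f = io.StringIO("ts,temp\n1,20.0\n2,21.0\n")
gen = read_lines(f)
print(next(gen))  # 'ts,temp'  (nothing beyond this line has been read)
print(list(gen))  # ['1,20.0', '2,21.0']
```

A candidate who has never seen `yield` can still reason about the lazy-evaluation idea once the interviewer confirms the syntax itself isn’t the point.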

Ineffective Interviewee Behaviours (Technical Section)

  1. Remains silent when facing challenges instead of seeking clarification
  2. Fails to connect new concepts with prior knowledge 
  3. Calls in from noisy, visually distracting environments, thus creating friction on top of existing challenges like accents.

Tips for Interviewers (Technical Section)

  1. Start with guiding questions that explore high-level considerations before narrowing down. This helps candidates anchor their reasoning in principles rather than trivia.
  2. Avoid overvaluing your own prepared “correct answers.” The goal isn’t to test memory, but to observe reasoning.
  3. Withhold judgment in the moment, especially when the candidate explores a tangential but thoughtful direction. Let them follow their thought process uninterrupted. This builds confidence and reveals how they navigate ambiguity.
  4. Use curiosity as your primary lens. Ask yourself, “What is this candidate trying to show me?” rather than “Did they get it right?”

LLM: A Window into Learning Styles

Modern technical interviews should reflect the reality of tool-assisted development. I encouraged candidates to use LLMs — not as shortcuts, but as legitimate creation tools. Restricting them only creates an artificial environment, divorced from real-world workflows.

More importantly, how candidates used LLMs during coding exercises revealed their learning preferences (learning-optimized vs. task-optimized) and problem-solving styles (explore vs. exploit).

You can think of these two dichotomies as two sides of the same coin:

Learning-Optimized vs. Task-Optimized (Goals and Principles)

Explore vs. Exploit (How it’s done)

4 styles of prompting

In Exercise 2, I deleted a file.seek(0) line, causing pandas.read_csv() to raise EmptyDataError: No columns to parse from file
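The pandas error stems from the file cursor sitting at end-of-file after the data is written. A stdlib-only reproduction of the same cursor behavior (pandas itself left out to keep the sketch dependency-free):

```python
import io

buf = io.StringIO()
buf.write("a,b\n1,2\n")

# The cursor now sits at the end of the buffer, so reading yields
# nothing -- which is why pd.read_csv(buf) would raise
# "EmptyDataError: No columns to parse from file"
print(repr(buf.read()))   # ''

buf.seek(0)               # rewind: the line deleted in Exercise 2
print(repr(buf.read()))   # 'a,b\n1,2\n'
```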

Candidates prompted LLMs in 4 styles:

  1. Paste error message only
  2. Paste error message and erroring line from source code
  3. Paste error message and full source code
  4. Paste full traceback and full source code

My interpretations

Those who choose (1) start looking at a problem from the highest level before deciding where to go. They consider that the error may not even be in the source code, but the environment or elsewhere (See Why Code Rusts in reference). They optimize for learning rather than fixing the error immediately. 

Those with poor code-reproduction discipline who choose (4) may not learn as much as those choosing (1), because they can’t see the error again after fixing it.

My ideal is (4): fix the error quickly, but take good notes along the way so the root cause is understood and you come away with sharper debugging instincts.

Red Flag: Misplaced Focus on Traceback Line

Even though (2) included more detail in the prompt than (1), more isn’t always better.
In fact, (2) raised a concern: it suggested the candidate believed the line highlighted in the traceback (---> 44 df_a_loaded = pd.read_csv) was the actual cause of the error.

In reality, the root cause could lie much earlier in the execution, potentially in a different file altogether.

Prompt Efficiency Matters

After prompt (2), the LLM returned three suggested fixes — only the third one was correct. The candidate spent time exploring Fix #1, which wasn’t related to the bug at all. However, this exploration did uncover other quirks I had embedded in the code (NaNs sprinkled across the joined result from misaligned timestamps as the joining key).
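A hypothetical reconstruction of that embedded quirk (the column names and values are mine, not the exercise’s): two tables joined on a timestamp key, where the sources disagree on which timestamps exist.

```python
import pandas as pd

# Two sensor tables keyed on timestamp; the sources are misaligned
df_a = pd.DataFrame({"ts": [1, 2, 3], "temp": [20.0, 21.0, 22.0]})
df_b = pd.DataFrame({"ts": [1, 3, 4], "humidity": [30.0, 32.0, 33.0]})

# An outer join keeps the union of keys, so every key missing from
# one side leaves NaNs sprinkled across the joined result
joined = df_a.merge(df_b, on="ts", how="outer")
print(joined)
```

Spotting those NaNs and asking where they came from, rather than silently dropping them, is exactly the data-lineage instinct listed under effective behaviours above.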

Had the candidate instead used a prompt like (3) or (4), the LLM would’ve provided a single, accurate fix, along with a deeper explanation directly tied to the file cursor issue.

Style vs Flow

Some candidates added pleasantries and extra instructions to their prompts, rather than just pasting the relevant code and error message. While this is partly a matter of style, it can disrupt the session’s flow, especially under time constraints or with slower typing, delaying the solution.

There’s also an environmental cost.


Feedback: The Real Cover Letter

After each interview, I asked candidates to write reflections on:

This is far more useful than cover letters, which are built on asymmetric information, vague expectations, and GPT-generated fluff.

Here’s an example from the offered candidate.

Excelling in this area builds confidence that colleagues can provide candid, high-quality feedback to help each other address blind spots. It also signals the likelihood that someone will take initiative in tasks like documenting processes, writing thorough meeting minutes, and volunteering for brown bag presentations.

Effective Interviewee Behaviours (Feedback Section)

  1. Communicates expected completion times and follows through with timely submissions.
  2. Formats responses with clear structure — using paragraph spacing, headers, bold/italics, and nested lists — to enhance readability.
  3. Reflects on specific interview moments by drawing lessons from good notes or memory.
  4. Recognizes and adapts existing thinking patterns or habits through meta-cognition

Ineffective Interviewee Behaviours (Feedback Section)

  1. Submits unstructured walls of text without a clear thesis or logical flow
  2. Fixates solely on technical gaps while ignoring behavioural weaknesses.

Tips for Interviewers (Feedback Section)

  1. Live feedback during the interview is time-constrained, so give written feedback afterwards on how they could have improved in each section, with learning resources
    – If done independently from the interviewee’s feedback, and it turns out the observations match, that’s a strong signal of alignment 
    – It’s an act of goodwill towards unsuccessful candidates, a building of the company brand, and an opportunity for lifelong collaboration

Carrying It Forward: Actions That Matter

For Interviewers

  1. Develop observation and facilitation skills
  2. Provide actionable, empathetic feedback
  3. Remember: your influence could shape someone’s career for decades

For Interviewees

  1. Make the most of the limited information you have, but try to seek more
  2. Be curious, prepared, and reflective to learn from each opportunity

People will forget what you said, people will forget what you did, but people will never forget how you made them feel – Maya Angelou

As interviewers, our job isn’t just to assess — it’s to reveal. Not just whether someone passes, but what they’re capable of becoming.

At its best, empathetic interviewing isn’t a gate — it’s a bridge. A bridge to mutual understanding, respect, and possibly, a long-term partnership grounded not just in technical skills, but in human potential beyond the code.

The interview isn’t just a filter — it’s a mirror. The interview reflects who we are. Our questions, our feedback, our presence — they signal the culture we’re building, and the kind of teammates we strive to be.

Let’s raise the bar on both sides of the table. Kindly, thoughtfully, and together.


If you’re also a hiring manager passionate about designing meaningful interviews, let’s connect on LinkedIn (https://www.linkedin.com/in/hanqi91/).

I’d be happy to share more about the exercises I prepared.

Resources

  1. Writing useful commit messages: https://refactoringenglish.com/chapters/commit-messages/
  2. Writing impactful proposals: https://www.amazon.sg/Pyramid-Principle-Logic-Writing-Thinking/dp/0273710516
  3. http://highagency.com/
  4. Glue work: https://www.noidea.dog/glue
  5. The Missing Readme: https://www.amazon.sg/dp/1718501838
  6. Why Code Rusts: https://www.tdda.info/why-code-rusts

The post Beyond the Code: Unconventional Lessons from Empathetic Interviewing appeared first on Towards Data Science.