AI with Michal

One-way video interview

A structured screening format where candidates record answers to preset questions on their own schedule, without a live interviewer present, before the hiring team reviews the clips.

Michal Juhas · Last reviewed May 3, 2026

What is a one-way video interview?

A one-way video interview lets candidates record answers to preset questions on their own schedule, without a live interviewer on the other end. Reviewers watch the clips later, often in batches. The format sits between a written application and a phone screen, and most teams use vendors such as HireVue, Spark Hire, or myInterview to run it.

Illustration: a candidate recording answers to preset questions on their own; reviewers later watching the clips in a queue, scoring against a rubric

In practice

  • A recruiter at a scale-up sends 40 candidates a Spark Hire link instead of booking 40 calls. The same four questions, a 90-second limit each, and the team watches together on Thursday afternoon before deciding who moves forward.
  • Candidates in sourcing forums call it "the video thing" or "the pre-screen video." They record it on a phone between other tasks, often in an evening or on a commute.
  • TA leaders pitch the format to hiring managers as a way to see candidates before committing to a live calendar block, which sidesteps the fight over scarce morning slots.

Quick read, then how hiring teams use it

This is for recruiters, sourcers, TA, and HR partners who need the same vocabulary in debriefs, vendor calls, and policy reviews. Skim the first section for a fast shared picture. Use the second when you are deciding how it fits into your ATS and screening process.

Plain-language summary

  • What it means for you: Instead of booking 30 phone screens, you send a link. Candidates record two to four questions on their own time. You and the hiring manager watch later.
  • How you would use it: For early funnel roles where the same questions appear on every call and volume is too high to schedule one by one.
  • How to get started: Write the three questions you ask on every first screen. Add a rubric for each. Pilot on one role where you receive more than 15 applications per week and the job description is stable.
  • When it is a good time: When scheduling is the real bottleneck and you can staff a human review gate within five business days of submission.

When you are running live reqs and tools

  • What it means for you: One-way video is a scheduling trade, not a quality upgrade. You gain throughput; you lose the follow-up question. Pair it with a rubric and a reply SLA or you get faster screening with the same bias patterns.
  • When it is a good time: When hiring managers ask for pre-screen signal and decline to calendar screen calls, or when intake spikes from programmatic advertising or automated outreach.
  • How to use it: Wire the vendor into your ATS so reviewed clips move stages automatically. Keep AI-generated scores off the official record until you have audited them for adverse impact.
  • How to get started: Resolve consent and data retention questions before you invite candidates. Test captions and check the recording flow on mobile at low bandwidth.
  • What to watch for: Completion drop-off after the invite, ghosting post-submission, and automated scoring overlays that legal has not reviewed.
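The adverse-impact audit mentioned above can be run with a basic four-fifths rule check before any AI-generated score touches the official record. This is a minimal sketch, not legal advice: the group labels and counts are hypothetical, and you would export real stage-pass numbers per group from your ATS.

```python
# Hedged sketch: a four-fifths (adverse impact) check on stage-pass rates
# by group. All names and numbers below are hypothetical placeholders.

def pass_rates(counts):
    """counts: {group: (passed, total)} -> {group: pass rate}"""
    return {g: passed / total for g, (passed, total) in counts.items()}

def four_fifths_check(counts):
    """Flag groups whose selection rate falls below 80% of the
    highest group's rate (the EEOC four-fifths rule of thumb).
    Returns {group: ratio_to_top_rate} for flagged groups."""
    rates = pass_rates(counts)
    top = max(rates.values())
    return {g: round(r / top, 2) for g, r in rates.items() if r < 0.8 * top}

# Hypothetical outcomes for one screening stage
stage_counts = {
    "group_a": (45, 100),  # 45% advanced past video review
    "group_b": (30, 100),  # 30% advanced past video review
}
print(four_fifths_check(stage_counts))  # {'group_b': 0.67}
```

A ratio below 0.8 does not prove bias on its own, but it is the conventional trigger for a closer look before the scoring overlay goes live.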

Where we talk about this

Live AI in recruiting sessions at AI with Michal use one-way video as a working case study: where human review needs to stay, what the rubric needs to say, and how to brief candidates so they trust the process. If your organization is deciding whether to add, replace, or remove this step, bring the real policy constraints to Workshops and work through them with practitioners who have run both sides.


One-way video versus live phone screen

| Factor | One-way video | Live phone screen |
| --- | --- | --- |
| Scheduling | Candidate picks own time | Both parties must align |
| Follow-up questions | Not available | Available in real time |
| Accessibility | Requires captions and a low-bandwidth option | Easier to accommodate |
| Bias risk | Appearance and vocal cues visible on review | Name and voice cues live |
| Volume | Scales to hundreds | Bottlenecks at recruiter hours |


Frequently asked questions

What exactly is a one-way video interview?
Candidates record answers to preset questions on their own schedule, usually two to four questions with a time limit of 60 to 120 seconds per answer. Vendors like HireVue, Spark Hire, and myInterview deliver the prompts, capture the recording, and return the clips to reviewers. The recruiter watches later, often in batches. There is no live interviewer. The format sits between a written application and a phone screen. Teams use it to handle more volume or to reduce time on calls that cover the same ground every time.
How is a one-way video interview different from async screening?
One-way video is one medium inside the broader category of async screening. Async screening covers any step candidates complete on their own time: typed forms, audio responses, multiple choice, or video clips. The specific choice of video matters for candidate experience, accessibility requirements (captions, low bandwidth, accommodation workflows), and the consent language your legal team needs to approve. When briefing vendors or updating your careers page, treat them as related but distinct. Saying "we do async screening" does not automatically mean you are capturing video and need a biometric or recording consent clause.
When does one-way video beat a live phone screen?
Use it when you have more than 30 screener slots per week on a role, when the same five questions appear on every call, and when hiring managers want to see candidates before committing calendar time. It works poorly when the role warrants follow-up probes, when senior candidates will walk at friction, or when you do not yet have a rubric tied to your scorecard. Without rubric anchors, reviewers fall back on general impression and the kind of gut-feel bias that selection research has flagged for decades. Slow down to build the rubric before you scale.
What do candidates commonly say about the format?
Negative feedback clusters around three issues: technical failures before recording starts, not knowing how responses will be judged, and never hearing back after submitting. Threads on r/recruiting consistently flag ghosting after the video stage as the sharpest trust damage. Counter-moves: publish what you are looking for, set a reply SLA of five to seven business days in the invite email, and send a human confirmation when review is done regardless of outcome. Completion rates improve when candidates see a named person signing the invite, not a generic system address from a vendor domain.
How do you reduce bias when reviewing one-way video responses?
Before the first batch, write scoring anchors for each question tied to your scorecard: three to five bullet points describing what a strong, adequate, or weak answer looks like for that specific role. Calibrate on five test responses with the hiring manager before live reviews start. Review all responses to one question across the cohort before moving to the next, so you are not anchored by a candidate's overall impression. Read auto-captions with audio muted occasionally to check whether you are scoring content or vocal delivery. Log which rubric version was active when reviews ran so you can audit later.
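The review order described above (all answers to one question across the cohort, then the next question) can be made mechanical so no reviewer drifts back to candidate-by-candidate viewing. A minimal sketch, with hypothetical question keys and rubric text:

```python
# Hedged sketch: question-first review ordering with rubric anchors.
# Question names, anchors, and candidate IDs are all hypothetical.

rubric = {
    "q1_motivation": {
        "strong": "Names the role's specific problems and their own fit",
        "adequate": "Generic interest with one concrete detail",
        "weak": "No reference to the role or company",
    },
    "q2_tooling": {
        "strong": "Describes a real workflow end to end",
        "adequate": "Lists tools without how they connect",
        "weak": "Cannot name the tools used daily",
    },
}

def review_order(rubric, candidates):
    """Yield (question, candidate) pairs so every answer to one
    question is scored before the next question starts."""
    for question in rubric:
        for cand in candidates:
            yield question, cand

for question, cand in review_order(rubric, ["cand_014", "cand_027"]):
    # Score this clip against rubric[question] only; log the rubric
    # version alongside the score so later audits can reconstruct it.
    pass
```

The point of the generator is the ordering itself: a reviewer following it cannot accidentally form a whole-candidate impression before the last question.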
When is AI scoring on one-way video risky?
Some vendors overlay automated scoring on facial expressions, vocal cadence, or transcript sentiment. NYC Local Law 144 requires an annual bias audit if an AI tool is used for employment decisions. The EU AI Act classifies certain hiring AI as high-risk. Beyond legal mandates, expression analysis has poor construct validity: a candidate who pauses to think, speaks quickly, or interviews in a second language can score differently on the same answer. If your vendor surfaces a confidence score or trait match, ask what that signal is built on and whether you can disable the overlay. Pair any automated scoring with human-in-the-loop review before it affects decisions.
Where can we learn and practice with peers?
Bring your current vendor setup or an RFP to a live AI in recruiting workshop and work through rubric design with practitioners who have reviewed real batches. The Starting with AI: the foundations in recruiting course covers async steps alongside scorecards and bias risk so the decisions feel connected, not isolated. Read AI candidate screening with your legal partner before adding automated scoring or changing vendors. Join membership office hours when edge cases around accommodations or GDPR consent come up, because those questions rarely have a clean answer in a help doc.
