You've heard about calibration. Your leadership team mentions it. Your peers run sessions. But what does one actually look like? See how Confirm handles performance calibration.
If you've never run a calibration session before, the idea can feel vague, maybe even intimidating. You know it's supposed to help you fairly distribute ratings and raises. But how do you actually run one without it turning into a two-hour grind?
Here's the good news. A real calibration session is smaller and faster than you think. This is what one actually looks like, start to finish.
Before the Meeting: 20 Minutes of Prep
You don't walk into a calibration blind. You need three things ready.
First: ratings data. Pull the ratings from each of your managers. You want to see the spread: how many people did each manager rate as High Performer, solid contributor, needs improvement, etc.? Print it. Have it in front of you.
Second: manager summaries. Ask your managers to write 1-2 sentences on each employee they're nominating for a raise or special consideration. Not a paragraph. Just context. "Sarah's been picking up security projects and pushing back productively on our tech debt." That's enough.
Third: the room. You need a conference room with a whiteboard or screen everyone can see. Have water. Seriously. Calibration is a mental workout.
Set a timer. You're aiming for 20-30 minutes depending on team size, not a half-day event.
First 5 Minutes: Ground Rules and Scope
Open by saying what you're doing and what you're not doing.
"We're here to align on ratings and promotion readiness. This isn't a referendum on anyone's value. It's a check on whether our ratings make sense as a group. If Susan's manager rated her a High Performer and Kevin's manager rated him a solid contributor, and they've done similar work, we need to talk about why."
Then set the rules:
- No surprises in this room. If there's a major issue with someone's performance, that should have been discussed with them already.
- Managers stay out of decisions about their own people. You can listen, but you don't vote on your own directs.
- We're looking for consistency, not perfection. We're not trying to rank everyone 1-10 in absolute terms. We're trying to make sure our ratings apply the same standard across teams.
That takes five minutes. It sets the tone and prevents the room from turning into a popularity contest.
Next 10 Minutes: Review and Flag
Now you go through the ratings data as a group.
Pull up the spread. "Our managers rated 23 people as High Performer. Let's see who." Go through the list. Does it feel right? Are there obvious gaps? Is someone's top performer doing work similar to someone else's solid contributor?
When something doesn't match, when the data surprises you, stop and dig.
Manager 1: "I rated Sarah a High Performer because she's been driving the analytics overhaul."
Manager 2: "Michael's been doing similar work on the data pipeline, but I rated him solid contributor because he mostly works in his lane."
This is the conversation. "Okay, let's clarify the scope. What's different about Sarah's work?"
Usually the answer is real: Sarah's scope was broader, or Michael just joined the team. Sometimes the answer reveals that someone should probably be recalibrated up.
Don't argue. Just talk through it. Most disagreements dissolve when people actually explain their reasoning.
Flag three to five key inconsistencies. Don't try to resolve everything. Flag and move on.
Last 5 Minutes: Document and Decide
Before everyone leaves, document what you agreed to.
"Our High Performer list is [names]. Promotion candidates are [names]. People who need to improve performance before we consider a raise: [names]."
Write it down. Literally. Someone types it. Everyone confirms it. Boom. You have alignment.
Then: "We're going to communicate the results to your managers this week. These ratings are not surprises; they've been discussed one-on-one. Our job is just to make sure we're applying the same standard."
After: Communicate the Results
Within a week, your managers need to hear what happened.
Send them a summary. "These folks are being promoted. These folks are High Performers and in the raise pool. These folks we need to have performance conversations with."
Don't over-explain. They'll have questions. Take them one-on-one if needed.
Then communicate with employees. High Performers should hear "you're being considered for a raise and we want you here." People who need improvement should have that conversation privately, with context. Not as a surprise from a calibration session.
Why This Works
Calibration isn't complicated, and it shouldn't be. You're not trying to build a perfect ranking system. You're trying to spot inconsistencies and make sure your team agrees on who's performing well.
The time pressure actually helps. When you have 20-30 minutes, you focus on the signal, not the noise. You spend time on real disagreements, not debating decimal points.
And you move fast. Your managers don't have to sit through hours of meetings. You get alignment without friction.
Ready to Run One?
If your team is growing and ratings are all over the place, a calibration session is one short meeting that will save you months of frustration.
The structure is simple. The outcome is clear. And it doesn't have to be painful.
Want to see how this works in practice? Confirm makes running calibration sessions faster and more data-driven. You'll have your ratings view pre-built, manager feedback organized, and key inconsistencies flagged automatically. That means you can spend your 20 minutes on what actually matters.
Request a demo and we'll show you how Confirm cuts calibration time in half while actually improving the quality of your decisions.
