When I spend money, I expect to see results. So when our LinkedIn ads campaign hit $649 in spend with zero qualified leads to show for it, I sat down with our team to figure out what went wrong.
Here's what happened, what we learned, and how we're fixing it.
The Setup
In early March 2026, we decided to test LinkedIn ads for Confirm, our performance management software. The plan was straightforward: target HR leaders at mid-market companies (100-2000 employees) with messaging around continuous performance management.
We created three ad variations:
- One focused on reducing performance review cycle time
- One on continuous feedback instead of annual reviews
- One on the cost of bad performance reviews
We set a daily budget of $30, targeted people with HR titles at the company sizes we care about, and let it run for about three weeks.
The result: 3,247 impressions, 47 clicks, $13.80 CPC, and zero leads. Not even an email signup from someone interested in a demo.
This is a failure. Here's why it happened.
Reason 1: Our Targeting Was Too Broad
We targeted everyone with an HR title. That's a huge range: HR Coordinators, HR Admins, HR Managers, VPs of People, Chief People Officers. These are completely different personas with completely different buying signals.
We should have started with VPs of People and above. These are the people who actually decide whether to invest in new performance management software. Instead, we cast a wide net and got a lot of noise.
Lesson: Test audience narrowing first. Target job titles most likely to have budget authority and pain around your specific problem. We were targeting "anyone in HR," not "the person who decides if we buy new HR software."
Reason 2: The Ads Themselves Were Weak
Looking back at the copy, our ads focused on product features (continuous feedback, eliminate annual reviews) instead of the problem they solve.
An HR Manager dealing with a broken performance review process doesn't want to hear about continuous feedback. She wants to know: Will this save me 20 hours a month? Can this reduce turnover? Will my CEO actually use this?
We talked about the thing, not the outcome. We made the mistake of assuming everyone in HR cares about the same thing. They don't.
Lesson: Test messaging against the actual problem the buyer is trying to solve. Start with one specific persona and one specific outcome. Don't try to be everything to everyone.
Reason 3: No Clear Next Step
Our ads linked to our homepage. Not a landing page specific to the ad. Not a demo signup. The homepage.
This is a classic funnel leak. We drove people to a general page instead of to the next step in the buying process. If someone clicked an ad about continuous performance reviews, they should land on a page about continuous performance reviews with a clear call to action. Not on a homepage where they have to hunt for what we're selling.
Lesson: Drive to the specific conversation, not the general brand. Create landing pages that match the ad promise and lead directly to the next step (demo signup, free trial, call with sales).
Reason 4: We Didn't Have a System to Track Them
Even if someone had clicked through, signed up, and entered our funnel, we had no way to attribute them back to the LinkedIn campaign. No UTM parameters on the links. No distinct landing pages. Nothing.
A lead from a LinkedIn ad that signs up on our homepage looks identical to an organic visitor. You can't measure what you can't track.
Lesson: Set up tracking before you launch. Use UTM parameters. Create campaign-specific landing pages. Have your sales team tag inbound leads by source. You need to know where your leads come from to optimize your spending.
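As a concrete sketch of the first fix, here's how UTM-tagged ad links can be built programmatically (Python; the domain, campaign, and variation names are hypothetical examples, not our real naming scheme):

```python
from urllib.parse import urlencode

def utm_link(base_url, source, medium, campaign, content=None):
    """Append standard UTM parameters so analytics can attribute the click."""
    params = {
        "utm_source": source,      # the ad platform, e.g. "linkedin"
        "utm_medium": medium,      # the channel type, e.g. "paid-social"
        "utm_campaign": campaign,  # which test this click belongs to
    }
    if content:
        params["utm_content"] = content  # which ad variation got the click
    return f"{base_url}?{urlencode(params)}"

# Hypothetical example: tag the continuous-feedback ad variation
link = utm_link("https://example.com/continuous-reviews",
                "linkedin", "paid-social", "hr-leaders-test",
                content="variation-b")
print(link)
```

Every link in every ad variation gets its own tag, so a signup that arrives through one of these URLs is distinguishable from an organic visitor in your analytics and CRM.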
Reason 5: We Didn't Test Small
We spent $649 before we had any evidence that LinkedIn was the right channel for us, that our messaging was working, or that our targeting was sound.
A better approach: Spend $100-200, measure results against that small spend, and only scale if you see the leading indicators you need (click-through rate, cost per click, landing page conversion rate). Then use those metrics to decide whether to spend more.
We skipped that step. We went to $30/day and just... waited.
Lesson: Test in phases. Spend small to learn fast. Only scale a channel once you've confirmed the fundamentals work (targeting is right, messaging resonates, landing pages convert).
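The phased gate described above can be sketched as a simple go/no-go check (Python; the threshold values are illustrative assumptions, not benchmarks from our campaign — set your own based on your deal size and funnel economics):

```python
def should_scale(spend, impressions, clicks, conversions,
                 max_cpc=8.00, min_ctr=0.008, min_conv_rate=0.02):
    """Decide whether a small paid test justifies more budget.

    Thresholds are illustrative defaults: scale only if click-through
    rate, cost per click, and click-to-conversion rate all clear the bar.
    """
    ctr = clicks / impressions if impressions else 0.0
    cpc = spend / clicks if clicks else float("inf")
    conv_rate = conversions / clicks if clicks else 0.0
    return ctr >= min_ctr and cpc <= max_cpc and conv_rate >= min_conv_rate

# Our actual campaign numbers: $649 spend, 3,247 impressions, 47 clicks, 0 leads
print(should_scale(649.00, 3247, 47, 0))
```

Run against our real numbers, the gate fails on two of three indicators: $13.81 CPC against an $8 ceiling, and zero conversions. Checking this at $100-200 of spend would have told us the same thing three weeks and $450 earlier.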
What We're Fixing
We've already started making changes:
- We're creating a dedicated landing page for each persona (for CFOs, for People Leaders, for HR Managers). Each page will address the specific problem that persona faces and include a clear call to action.
- We're tightening our targeting. Instead of "anyone in HR," we're starting with People Leaders at companies with 200-1000 employees. We'll test that first, measure results, then expand if it works.
- We're writing new ad copy that leads with the outcome, not the feature. Instead of "continuous feedback," we're testing "Close performance gaps 40% faster."
- We're setting up tracking. Every ad link has UTM parameters. Every landing page has a dedicated form. Every conversion is tagged in our CRM so we know where it came from.
- We're committing to a test budget. We're spending $300 for the next 30 days on a single, narrow test. If our new approach produces leads at a reasonable cost, we'll scale. If it doesn't, we'll pause and try something else.
Why I'm Sharing This
Most companies don't talk about their failed ad spend. You see the wins and the case studies, not the $649 experiments that go nowhere.
But there's value in sharing what didn't work. If you're running ads to B2B buyers, watch for the same mistakes I made: targeting too broad, messaging about the feature instead of the outcome, no tracking, no testing discipline.
The good news: They're all fixable. None of these are signs that LinkedIn ads don't work for us. They're signs that we weren't using LinkedIn ads correctly.
We'll report back once we know whether our new approach works. But in the meantime, if you're running ads and seeing similar results or seeing no results at all, look at these five things. One of them is probably the issue.
