Why Most Leadership Development Fails — And What Works Instead

Here’s a number that should bother anyone who invests in leadership development: organizations worldwide spend an estimated $60 billion annually on leadership training and development programs. Yet research consistently shows that most of that investment fails to produce lasting behavior change. Programs underperform, skills don’t transfer back to the workplace, and leaders return to the same patterns within weeks of the offsite wrapping up.

I’ve seen this from both sides. As an HR and talent leader at McKesson, Stanford, T-Mobile, and other organizations, I’ve designed, championed, and evaluated leadership development programs. Some were genuinely effective. Many were not. And after 20+ years of watching what works and what doesn’t, I’ve come to a conclusion that’s uncomfortable for anyone selling two-day workshops: the dominant model of leadership development is structurally flawed. Not because the content is bad, but because the delivery model ignores how adults actually learn and change.

The 70-20-10 Problem

The most useful framework for understanding why leadership development disappoints is the 70-20-10 model, which grew out of research at the Center for Creative Leadership and has been a foundation of organizational learning theory for decades. It describes where development actually happens:

70% from experience — practicing new skills in real situations, taking on stretch assignments, making mistakes, and reflecting on what worked and what didn’t.

20% from relationships — coaching, mentoring, feedback from peers and managers, and learning from watching others lead.

10% from formal learning — training programs, workshops, courses, books, and structured educational content.

Now consider where most organizations put their leadership development dollars: the 10%. They send leaders to a two-day workshop, a week-long executive education program, or an annual leadership retreat. The content is often excellent. The experience is engaging. The participants leave feeling energized and full of good intentions. And within three weeks, almost nothing has changed.

This isn’t a failure of the training. It’s a failure of the model. Formal learning can build awareness and introduce frameworks, but it cannot — by itself — change behavior. Behavior change requires practice, feedback, accountability, and time. It requires the 70% and the 20% that most organizations neglect.

Why the 10% Is So Seductive

If formal training is only 10% of development, why does it consume such a disproportionate share of the budget and attention? Because it’s visible, measurable, and easy to buy.

You can schedule a workshop. You can count attendance. You can distribute satisfaction surveys and report that 92% of participants rated the experience as “excellent.” You can check the box: leadership development? Done. The organization invested. The leaders were trained. If behavior doesn’t change, the implicit assumption is that the individual failed to apply what they learned — not that the model was insufficient.

The 70% and 20%, by contrast, are messy. They happen over months, not days. They’re hard to schedule, hard to measure, and hard to purchase from a vendor. Nobody sells “six months of deliberate practice with structured reflection and accountability.” But that’s what actually changes leaders.

Recent research underscores this. A peer-reviewed framework published in Behavioral Sciences found that workplace application of learning from leadership programs is typically low, and identified 65 evidence-based strategies for improving outcomes — the vast majority of which focus on what happens before, during, and after the formal program, not on the program content itself. The content is usually fine. The transfer system around it is what’s broken.

What Actually Works

The organizations I’ve seen produce real leadership development results share a few common practices. None of them are complicated. All of them require more patience and discipline than a one-time event.

Start with data, not assumptions. Before you invest a dollar in developing a leader, find out what actually needs to be developed. Not “communication skills” or “executive presence” — those are categories, not targets. Use validated assessments to identify specific behavioral patterns. The Hogan suite shows how a leader operates under normal conditions, what happens when they’re under stress, and what drives them. The EQ-i 2.0 maps emotional intelligence across 15 dimensions, revealing exactly where a leader is strong and where they’re getting in their own way. When development starts with data instead of assumptions, the work is targeted from day one. As I wrote in an earlier post on assessment-driven development: assessments serve as mirrors and maps, not verdicts. They give leaders language for patterns they could feel but couldn’t name.

Invest in the 20% — coaching and mentoring. This is where the research is clearest. Managers who receive coaching after formal training show dramatically better work performance than those who receive training alone — some studies report improvement rates as high as 70% compared to training-only cohorts. Coaching provides what workshops cannot: individualized attention to a leader’s specific challenges, real-time problem-solving, accountability for behavior change, and a safe space to be honest about what’s not working. The cadence matters too. Bi-weekly coaching sessions over six months produce far more lasting development than a week-long intensive. Not because the intensive lacks value, but because development needs time to metabolize. You try something in a meeting on Tuesday, it doesn’t go well, you process it with your coach on Thursday, and you try a refined approach the following week. That cycle of action, reflection, and adjustment is how adults actually learn.

Design for the 70% — structured experience. The most powerful development happens on the job, but only when the experience is intentional. Stretch assignments, cross-functional projects, temporary leadership roles, and presenting to the board for the first time — these are all development opportunities, but only if someone frames them that way and helps the leader extract learning from them. Without that framing, a stretch assignment is just a stressful week. With it, it’s a data point in a development arc. The leader’s manager plays a critical role here. When a manager says “I’m giving you this project because it’s going to stretch your strategic thinking, and I want to debrief with you after the board presentation,” that’s development. When they say “You’re presenting to the board on Thursday because I’m traveling,” that’s just delegation.

Measure behavior change, not satisfaction. Stop asking whether participants enjoyed the experience and start measuring whether their behavior actually shifted. 360-degree feedback administered before and after a development engagement can quantify changes in how a leader shows up. Direct reports, peers, and managers will tell you whether the leader is giving better feedback, making clearer decisions, managing conflict differently, or demonstrating more self-awareness. That’s the return on investment — not whether the workshop had good catering.

The Compound Effect

Here’s what happens when you combine all three elements — data-driven assessment, structured coaching, and intentional on-the-job application: development compounds. A leader gets their Hogan debrief and discovers that their Cautious scale spikes under pressure, causing them to avoid decisions and wait for perfect information. Their coach helps them design a behavioral experiment: for the next two weeks, they’ll commit to making one decision per day within 24 hours instead of sitting on it. They practice. Some decisions go well; one doesn’t. They process the results with their coach. The pattern becomes visible, manageable, and eventually — over months, not days — it shifts.

That’s not a two-day workshop outcome. That’s a six-month coaching engagement grounded in assessment data and practiced in the real work of leading. It’s harder to schedule, harder to buy, and harder to put in a PowerPoint deck for the board. But it’s the only model I’ve seen consistently produce leaders who are actually different on the other side of the investment.

The data supports this. Research from DDI’s Global Leadership Forecast and others consistently links strong leadership development practices to measurable financial outperformance — in retention, productivity, and revenue growth. And leaders who receive coaching alongside training show performance improvements that training alone never delivers. The return is real — but only when the model matches how change actually happens.

Rethinking the Investment

If you’re a CEO or HR leader evaluating your leadership development spend, ask a different question. Instead of “What program should we send our leaders to?” ask: “What is the specific behavior change we need, and what combination of data, coaching, and experience will produce it?”

The answer will almost certainly involve less event-based training and more sustained, individualized development. It will involve assessment data that reveals specific targets. It will involve coaching that provides accountability and real-time support. And it will involve intentional on-the-job experiences that are framed as development, not just workload.

That’s the model that works. It’s not flashy. It doesn’t produce a great photo for the company newsletter. But it produces leaders who are measurably better at leading — and that’s the only outcome that matters.


Ready to invest in development that actually changes behavior? Our coaching engagements start with validated assessments, build on bi-weekly structured sessions, and focus relentlessly on application. Every session ends with a commitment. Every engagement starts with data. That’s the model.

→ Learn about Executive & Leadership Coaching

→ Schedule a Discovery Call
