managing partner at OnPoint Consulting LLC, based in
New York City.
If possible, use data from the performance management
system or conduct 360-degree assessments, gathering
feedback from multiple sources, to help determine the
development areas that are a high priority for leaders or
that high-potential employees need to focus on. Or, if cost
is a factor, use an anonymous survey to ask leaders directly.
Study the aggregate data for trends to uncover competency
gaps throughout the organization or division, she advises.
Be aware that just because leaders aren’t using a particular
skill in their current role doesn’t mean they don’t know how
to leverage that ability.
“Focusing on what behaviors or competencies are going
to be required in the future also gives organizations a better return on their investment,” DeRosa says.
In a recent study of one large organization, Phillips
notes, leaders said they preferred experiential learning.
“They want projects that achieve a purpose that also give
them an opportunity to develop as leaders,” she says.
Build data collection and evaluation into the process
early on. They shouldn’t be an afterthought, Phillips says.
Depending on your objectives and the type of program
you’ve selected, you might choose to collect data that will
help you answer the following questions:
• Do participants believe the program was relevant? Most
companies conduct surveys after training programs to
collect participant feedback. You can also capture feedback
throughout the program’s implementation to indicate the
extent to which participants have bought into the content
or the likelihood that they will apply what they’ve learned,
Phillips says. However, surveys don’t gauge behavioral
change, DeRosa notes. For that, you’ll have to do more.
• How much did they learn? Assessments can be given to
test the knowledge and insights acquired. Simulations
and role-playing can also provide evidence of learning.
• Are they applying what they learned to their jobs? To
measure that, you could collect more 360-degree feedback
after the program. But wait at least nine months, especially
if the individual is going through some on-the-job
experience or training, to allow time for any changes
to take effect, DeRosa advises.
• What impact did the program have on the business?
Improvement in measures representing output, quality,
costs, time, job satisfaction, customer satisfaction, work
habits and innovation is the ultimate goal, Phillips says.
DeRosa suggests also looking at retention figures and
whether participants were promoted.
• What is the return on investment (ROI)? Calculate the
monetary value of the changes in business impact. Subtract
the costs, both direct and indirect. The net benefits
divided by costs will give you the ROI.
• What are the intangibles? While they can be converted
to money and included in the ROI calculation, typically
the cost of doing so outweighs the benefits. If improvements
in areas such as teamwork, inclusion and employee
experience can be linked directly to leadership development,
the value is clear enough.
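The ROI arithmetic described above can be sketched in a few lines of code. The figures below are hypothetical, and the function name is an illustration, not part of any standard tool:

```python
def leadership_program_roi(monetary_benefits, direct_costs, indirect_costs):
    """Return program ROI as a percentage: net benefits divided by total costs."""
    total_costs = direct_costs + indirect_costs
    net_benefits = monetary_benefits - total_costs
    return 100 * net_benefits / total_costs

# Hypothetical example: $250,000 in measured business impact,
# $120,000 in direct and $30,000 in indirect program costs.
roi = leadership_program_roi(250_000, 120_000, 30_000)
print(f"ROI: {roi:.0f}%")  # ROI: 67%
```

A result above 0 percent means the program returned more than it cost; a negative result means the monetized benefits did not cover the total costs.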
Share your expectations upfront. Don’t surprise
participants by asking them to provide data after the
fact. They should know what you need in advance and be
given tools to help keep track of their progress. Let them
know they are expected not only to change their behavior but also to use their new knowledge to drive business results.
Share the results. Whatever your findings, share them
with business leaders. While it might be tempting to hide
negative results, hoping no one will remember to ask for
the analysis, that strategy likely won’t end well.
At a minimum, you can use your evaluation to make
improvements for the future. A negative ROI could
indicate that you spent too much money on the program,
for example, or that you involved people who didn’t need
to be included. However, you might also see an uptick in
“A negative ROI doesn’t mean you have to stop the program. It does mean you need to improve it in some way.”
While everything worth doing can—and should—be
measured, it might not be practical to evaluate some inexpensive programs so extensively, Phillips says. On the other
hand, if you’re involving people in a comprehensive process
because you’re trying to drive change or improve key business metrics, you will want to evaluate your program for
business impact and ROI.
“In the end, that’s what senior executives want to see—a
connection between these expensive programs and improvement in these key business measures,” Phillips says.
“Otherwise, why do it?”
Dori Meinert is senior writer/editor at HR Magazine.