Securing Genuine Alignment

Much has been written in the HR and Change Management community on the subject of alignment- and rightly so. A vexing issue to say the least, alignment (or lack thereof) is one of the most common reasons (excuses?) for not meeting performance targets. So it’s only fitting that we at PMW spend some time exploring the issue of alignment from a Performance Management perspective.

Why is it that alignment is so elusive? Is alignment something we can manage, or is it something intangible like culture that just develops over the years? Is creating alignment a people process or an operational process? All good questions, but questions that cannot be answered without looking more deeply at what alignment looks like, from the time it is first being developed to the time it is ultimately achieved.

Let’s start with what alignment is. Alignment, in its purest form, is a shared COMMITMENT to producing an outcome and the strategy through which that outcome will be achieved. It is a DECLARATION of ownership by EACH INDIVIDUAL team member, and a PROMISE to do their part in achieving that outcome. Most importantly, alignment is a CHOICE that each team member arrives at after understanding and EMBODYING the desired outcome and strategy. People can, and often do, disagree with parts of a solution, but can still remain aligned with the LARGER PURPOSE or CAUSE. Hence, they are usually able to say what is missing for them to come back into alignment. Stated simply, alignment is the HIGHEST level of commitment that can be observed in groups- much larger and more powerful than agreement, acceptance, “buy in”, or any other type of organizational consensus that may be achieved.

Sounds pretty straightforward, until you look at the PROCESS by which genuine alignment is created. That process is markedly different from the process of building consensus or compliance. Take a team whose leader is personally sold on a solution and simply wants to gain compliance with his strategy. That process can be a complex negotiation, or a simple mandate- but whichever path is chosen, it will likely result in a “compromised solution”- either a “watered down” version of the outcome, or a “watered down” level of commitment. Achieving that kind of consensus can be purely a people process- taking a group and leading them to the water, and hopefully getting them to take a sip. But it is nothing like the process of achieving genuine alignment around a bold vision and strategy observed in most leading edge organizations.

A genuinely aligned team looks very different. The commitments are not only bold and unwavering, but fully EMBODIED in the individuals who set out on the journey to achieve the vision. Think about your favorite sports team when everything seems to click. Players are in the right spots, appearing to almost read each other’s minds. They know each other’s tendencies and always seem to be one step ahead of the game. That’s real alignment. And that’s something you can’t teach, instruct or demand… as it is a commitment that is built within the individuals themselves. Genuine alignment is integrated into the fabric of a business- from the mission of the team to the goals of each individual, to the plan that is put in place to achieve it. From there, it becomes an integral part of each individual’s roles and accountabilities from start to finish.

What are the key ingredients necessary in building this kind of alignment? Here are a few “common denominators” you’ll see in a well aligned team as it is being formed:

Built on a BOLD VISION- Groups cannot be aligned if your business processes sit on a weak foundation. That is, a large organization cannot create real alignment around small tactical initiatives like “grow revenue by 5% per year” or “cut expenses by 10%”. A bold vision stretches the imagination into a world that looks radically different (and better) to the team that will take you there. Think about the visions of our early pioneers, forefathers, and activist leaders. Whether you believe in their cause or not, most would admit that their visions were inspiring. Columbus, Washington, Jefferson, JFK, MLK, Reagan- all laid out inspiring visions to their followers- visions that moved many of those followers to put their lives on the line to achieve them. People get aligned around a “CAUSE”, not a budget goal. Find out how to turn your vision, mission, and business objectives into a “bold cause”, and you’ll get a lot closer to your desired levels of alignment.

A Story about the Future based on GROUNDED ASSESSMENTS of the past and present- If you’ve got a bold vision, and it’s based on changing a current “reality”, you’d better be good at your assessments of the past and present. That means when you lay out your case for change and your new vision, it needs to be based on an accurate and defendable assessment of the current state- based on FACT- not feelings or opinions. Feelings and emotions might convince someone to follow you, but they won’t get people to “own” the outcome for themselves. People are smarter than that. In order to step out to the end of the plank, each member of the team needs to be sure that the risk is worth taking. For that reason, bulletproofing your assessments is a must in the early stages of alignment building.

Ability to WITHSTAND CHALLENGES and necessary course corrections throughout the journey- Alignment doesn’t usually happen in one step- in fact, it rarely ever does. Alignment building is iterative and continuous- from the start of the journey to the end. One of the accepted realities of alignment is that people can often come in and out of it as conditions change. What’s different about an aligned team is that team members, once initially aligned, are able to see and declare for themselves that they have fallen out of alignment on one or more aspects of the roadmap. So, from time to time, you may (and should expect to) get healthy challenges and questions about the path you’re on. Your answers and responses will be the keys to bringing those team members back into alignment. A plan that is airtight and defendable at the outset of the journey may develop problems as conditions change. A team that is genuinely aligned will be able to handle the types of challenges and course corrections that may be necessary during the journey, without risking the integrity of the outcome.

TRANSPARENCY of INFORMATION- Rarely does alignment work in a “closed book environment”. “Do it because I said so” and “trust us on this” are fine for declarations and compliance building, but won’t get you anywhere on building ownership for a commitment, and a personal promise to execute it. If the above two tenets of alignment are real, then the data environment needs to support them. Optimally, the data environment should be conducive to questioning and learning. If your assessments are grounded and defendable, then only good can come from sharing that data with your team openly and honestly. Closed systems will surely stifle progress toward genuine alignment.

ACCOUNTABILITY “Through and Through”- Embodied. Integrated. Embedded- Getting commitment woven into the fabric of your business processes is not possible until the commitment is part of every team member’s personal goals and reward system. If the bold ambition declared at the top is not seriously connected to each individual’s performance contract, there will be an alignment gap that is virtually impossible to fill. Performance contracts and reward systems are what document and connect the individual’s commitment to the broader ambition of the team. It is imperative that these ends of the spectrum get and stay connected.

So there you have it- a quick checklist to ensure you are on the path to an aligned and high performing team. While creating alignment is anything but “quick”, focusing on these items can make the process a lot faster and less painful than it would otherwise be. And the resulting alignment will be a lot stronger too!

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com

 

The Missing Ingredient in Change

Over the last few weeks, a few readers have asked me to comment on “how a good performance management methodology plays into some of the newer, and more popular change initiatives” in their companies- “whether it be six sigma, lean, value centric CRM”, or any of the other major initiatives underway inside our organizations today.

First, I’d be remiss if I didn’t say that performance management is NOT a change methodology in and of itself. Rather, it is a discipline- timeless in its applicability, and free of any bias toward a particular change methodology.

For years, we at ePGI have preached that Performance Management sits at the center of change. While that may sound a little self-serving coming from an organization that provides PM solutions, few companies that have successfully embraced these newer methodologies would argue with the importance of performance management in their overall journey.

PM, in its most basic form, is a process of measurement, diagnosis, and reporting that accompanies the journey of change. PM serves as an organizational gauge, which measures both the progress and quality of change. Think of it the way a pilot thinks of his altimeter, airspeed, and the other key indicators central to flight. No matter what model of aircraft a pilot chooses to fly from point A to point B- be it a single-engine Cessna or a fancy new G4 cross-continent jet- the basic elements of flying remain the same. The success of a flight depends on how well a pilot manages these critical indicators, and the supplemental diagnostic data that is available to the pilot on demand.

Case in point: I once sat next to a 747 pilot who described what the pilot was doing- play by play- as we made our approach into Sydney, Australia. What he described was not what a typical passenger would expect, given all the dials, gauges, and fancy displays visible to passengers as they peek into the cockpit during boarding. Instead, what he described was very focused and deliberate- concentration on a handful of key indicators, with detailed drill downs available should something fall outside of “normal control limits”.
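
For readers who like to see the idea in concrete terms, here is a rough sketch, in Python, of that same pattern applied to a performance dashboard: watch a handful of key indicators, and drill down only when one of them leaves its normal band. The metric names, control limits, and readings are invented purely for illustration- they do not come from any particular client or system.

# A rough sketch of "a handful of key indicators, with drill-downs on exception".
# The metric names, control limits, and readings are invented for illustration only.

KEY_INDICATORS = {
    # metric: (lower control limit, upper control limit)
    "on_time_delivery_pct": (92.0, 100.0),
    "unit_cost_per_widget": (40.0, 55.0),
    "complaints_per_1000_customers": (0.0, 25.0),
}

def review(readings):
    """Return the key indicators that fall outside their normal control limits."""
    exceptions = []
    for metric, (low, high) in KEY_INDICATORS.items():
        value = readings.get(metric)
        if value is None or not (low <= value <= high):
            exceptions.append(metric)
    return exceptions

def drill_down(metric):
    """Stand-in for the detailed diagnostic view a manager (or pilot) pulls up
    only when a key indicator is out of bounds."""
    print(f"Out of limits: {metric} - pulling up the detailed diagnostics...")

if __name__ == "__main__":
    todays_readings = {
        "on_time_delivery_pct": 89.5,          # below its lower limit
        "unit_cost_per_widget": 47.0,
        "complaints_per_1000_customers": 12.0,
    }
    for metric in review(todays_readings):
        drill_down(metric)

The code isn’t the point; the discipline is- a few well-chosen gauges, with the diagnostics held in reserve for exceptions.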

Long before balanced scorecard initiatives, six sigma programs, lean manufacturing methodologies, and the myriad other efficiency and quality solutions that have come on the scene in recent years, Performance Management was the mainstay of any organization worth its salt. What the newer and more popular change methodologies have brought to the scene are faster, better, and more efficient processes to create and manage change. No doubt about that. But no matter which methodologies you choose to embrace, you’ll never reach a productive destination without a good performance management program. Performance Management is THE common denominator, central to any effective change program.

The great irony of performance management is that despite its importance in everything an organization does, it is perhaps the simplest of processes to get your arms around and master. And while organizations spend millions to train, educate, and master new and emerging change techniques, many still fail to spend the comparatively small amount of time required to establish a good performance management foundation- the foundation that will likely make or break the resulting ROI.

Simply stated, the PM discipline is really about providing the information and analysis required to effectively manage people and processes. For example, we’ve all been schooled in the age-old “Plan/ Do/ Check/ Adjust” method for managing a particular function, process, or organization. Without an effective and clear process for measuring and analyzing performance, the execution of each of these steps would be severely impeded.
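
For illustration only, here is a minimal sketch of how measurement gates each turn of that Plan/ Do/ Check/ Adjust cycle. The target, the measured results, and the corrective action are all invented for the example- this is a sketch of the discipline, not a prescribed implementation.

# A minimal sketch of a Plan/Do/Check/Adjust cycle with measurement at its core.
# The target, the measured results, and the corrective action are invented examples.

def plan(target):
    return {"target": target, "actions": ["action A", "action B"]}

def do(the_plan):
    for action in the_plan["actions"]:
        print(f"Executing {action}")

def check(the_plan, measured_result):
    """Measurement and analysis: how far are we from the planned target?"""
    return measured_result - the_plan["target"]

def adjust(the_plan, gap):
    """Course-correct when the measured gap is material (here, more than 5% of target)."""
    if abs(gap) > 0.05 * the_plan["target"]:
        the_plan["actions"].append("corrective action")
    return the_plan

if __name__ == "__main__":
    cycle = plan(target=100.0)
    for measured_result in [92.0, 97.0, 101.0]:   # one measurement per period
        do(cycle)
        gap = check(cycle, measured_result)
        cycle = adjust(cycle, gap)
        print(f"Measured {measured_result}, gap {gap:+.1f}")

Strip away the code and the point remains: each turn of the cycle depends on a measurement being available and trusted.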

Now take something like Lean Six Sigma- one of the more recent and popular methodologies for identifying and capturing performance improvement. At its very core is an acronym called DMAIC- Define, Measure, Analyze, Improve, Control- a technique leaders in the six sigma discipline call a “structured data-based problem solving methodology”. Sound a little familiar? DMAIC, while hard to argue with, is really not too different from what successful organizations were already doing years prior. Are the newer methodologies better and more rigorous? Absolutely. But at their core are still the fundamentals of a good performance management discipline.

My intention with this comparison is NOT to criticize companies who have sworn to follow a particular improvement methodology- in this case, the six sigma following. Rather, what I am trying to illustrate is that without a solid performance management foundation- good measurement techniques, good analysis and diagnostic practices, good goal setting procedures, and good tracking and reporting processes- few, if any, of these approaches will achieve their desired outcome in terms of cost savings, quality improvement, or process speed and efficiency.

So as we embrace the principles and techniques of these new change methodologies, let’s be careful not to overlook the simpler- and more important- PM processes that are central to yielding the benefits these approaches promise. To use a sports analogy, performance management is really about good “blocking and tackling”. Gameplans and strategies can and will vary from competitor to competitor. New gameplans will emerge. New “gadget plays” will be introduced that will change the complexion of games to come. But without the fundamentals of blocking and tackling, few if any of those gameplans would achieve their intended outcome.

Such is the case with performance improvement. As we navigate our change initiatives, let’s make sure we put the appropriate emphasis and resources on the PM fundamentals. The extra benefits that accrue will serve you well not only within your current improvement programs, but also within the next generation of initiatives on your horizon.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com

 

Don’t Get “Stuck on Stupid!”

Whatever your political bent, or your view of the American media, you’ve got to love the recent comments of Lieutenant General Russell Honore during the Katrina aftermath.

When interrogated by reporters about Katrina-related mistakes and miscues- in the immediate aftermath of the storm, with Rita bearing down- the Lt. General fired back with one of the best “in your face” rebuttals in media history. “You guys are STUCK ON STUPID!”, he said, “…and I’m not going to answer those questions!” Then, as only great leaders can do, he shifted the attention to what could be done NOW… going forward. In one short phrase, he showed the insanity of a backward-looking fixation in a time of crisis, and the importance of quickly learning from mistakes and moving on. If only we could instill that kind of thinking into our organizations and personal lives.

We, as a culture, waste a lot of time fixated on the past. This is a tricky topic, because in order to learn, we have to be able to look backwards. I don’t believe the Lt. General meant to suggest we not look backward. Rather, I believe, he intended to show us the art of WHEN and HOW we should look back.

Here are a few of my observations about backward-looking actions, and where the line lies between effective diagnosis and what the good General would call a “stuck-on-stupid” culture:
———————————–
1. When (and WHEN NOT) to look backwards- the theme I believe was most central to the Lt. General’s comments was this: there is a time and a place for a backward-looking assessment.


In a football game, assessments occur at various intervals- half-time, end of quarters, during time outs, in the huddle, and sometimes even right before the play during a “check off” at the line of scrimmage. But assessments and questions about fault or blame NEVER occur DURING the play. The few seconds it takes for the play to unfold is about execution only. How stupid it would appear if one of the sports reporters walked onto the field and began questioning the coaches and players in the middle of a particular play. In sports, we see that kind of on-the-field interference as unacceptable, but in other crisis situations (like Katrina), we don’t think twice about the appropriateness of it.

In business it’s even worse. We have management agendas, advisors and consultants, board politics, and a myriad of other factors all screaming their opinions about how the play should unfold. Let’s take a lesson from our sports brethren, and save those assessments for AFTER the play is run. There’s nothing wrong with good assessments. But let’s save them for a time when they’ll have real impact, instead of being seen (appropriately) as a distraction.

One more quick analogy on when and how often we should look backwards. Think of the last time you drove a car. How much of the total time would you say you spent looking in the rear-view mirror? Most driving instructors will tell you to glance at the rear-view mirror about once every six seconds. At roughly a one-second glance, that translates to about 15% of your time looking backward… probably not too unreasonable a number to shoot for in the workplace.
———————————-
2. Are our comments focused on specific behaviors or on root cause? A lot can be observed from the questions we ask during a review of a failed strategy or play.

There is a great story that is told about a man who walks down a street and falls into a deep hole. He does the same thing each morning, with each day producing little or no real insight. The first few days are spent asking “why me?” type questions. The next few days are spent getting out of the hole quicker and more efficiently. The next few days, he walks around the hole. It’s not until the last day that the man decides to take a different route altogether, eliminating his risk of falling into the hole entirely. For many days, we might say this man was “stuck on stupid”. But he finally learned to ask the right questions, and only then was he able to solve his problem.
———————————-
3. The “SO WHAT” Test- Early in my career, I had a boss who would frequently add the margin comment “So What?” to his review of various letters and reports written by his staff. It was his way of saying, “OK, I hear you… and I get your point, but what is the implication, or the conclusion I should draw?”.

I’ve since applied this principle to much of what I do in business and life, and I believe this was one of the Lt. General’s key messages in his “stuck on stupid” rant. Assessments are great, as long as they lead to new learnings AND a new way of doing business. Most of the time, if timed right, good assessments will lead to changed strategies or actions. But there are many cases (and you see them every day) where the main purpose of an assessment is to assign blame or channel criticism. It’s in those cases that the assessment is better left alone, at least temporarily. Again, you can always come back to it later, after the play is run or the game at hand is over.
———————————-
4. Setting a new bar (measure the future, not the past)- One way to get “stuck on stupid” is to keep hammering away at a measure or metric that has failed you more than once. If that’s the case, it’s time to either change your approach to the problem, change the measure, or both.

At first blush, you might say that changing the measure seems like taking our eye off the ball, or conforming the metric to fit your situation. But in years of studying performance, I’ve found that repeated failures typically mean that you’re not sending the right signals. That is, you’re often tracking something that is too distant from an individual’s or team’s accountability area.

Last week, I played in a “scramble” format golf tournament in which each player hits a shot, and the team selects the best of those shots from which to progress. Our team was composed of a long hitter (driver), an approach man (for mid-range shots), an “up and down” guy (for greenside shots), and a good putter. Each one of us excelled in a particular area. We’ve played these kinds of tournaments many times before. But this time, we tried something different. We decided to assign goals for each category of performance, so that, for example, the driver was responsible for the number of fairways hit, the approach guy was responsible for greens hit in regulation, and so on. The impact on our collective performance was significant and noticeable (I won’t tell you our net score, but I will say it was a notable improvement), and far better than the occasions on which we focused only on the total score.
———————————-
5. Avoid the blame game / Reward (vs. punish) failures- this one is related to, but a bit different from #3 above, in that it deals with how you treat and reward accountable individuals.

In all of our organizations, we have those individuals who try new things, embrace change, and have a real bias toward action. Sometimes, improvisation is necessary, especially if the situation is very dynamic. And it’s in those cases where you need to reward quick decision making based on grounded assessments and learning.

There was an old adage years ago called “Go Ugly Early (and Often)”. Give me someone who learns and implements change quickly, versus someone who gets “stuck” in analysis of past performance. Looking back is good, but you’ve got to reward those who can also look forward and ACT. To me this is the essence of the Lt. General’s comments.
———————————-

Let’s face it, there’s something about the word STUPID that gets our attention. We saw it in Clinton’s campaign with the catchphrase “It’s the economy, STUPID”. And while we scold our children for calling someone stupid, none of us wants to be viewed that way. Why do you think we play the blame game so much? It’s all an attempt to not be viewed by our peers as the one who “dropped the ball”.

What we don’t always see, however, is that it is just as stupid (if not more so) to “lock in” on failures and the analysis of those failures without a corresponding focus on the timing of our assessments, the changes that need to result, and the speed with which we can then move on.

Let’s hand it to Mr. Honore for calling it as he saw it, and for getting all of us focused on what the future holds, rather than hung up on our past failures.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Are Our Strategic Plans Selling Us Short?

This morning in the Northeast, we had another taste of Fall in the air. I’m sure it was just a fake-out, as the temp climbed back up into the upper 70’s. I know our friends down south, particularly those in the hot and humid “Big Easy”, see that as early winter. For the rest of us, it was a nice departure from the sweltering dog days of summer.

Being a native of the south, one of the things I like about the Northeast is the change in seasons. Not only do I like the change in weather patterns, but all of the other signs of change that come with it. Pretty soon, the leaves will be changing, the days will be cooler, and before long, the holidays will be in sight. I think all of us, deep inside, like the change that occurs in the seasons. And as much as we probably hate to admit it sometimes, some of us actually like change itself.

For performance managers, the change in seasons- particularly summer into fall- also means our lives are about to become pretty hectic. I like to think of it as our “tax season”. We start thinking about updating our strategic plan, preparing initiatives for the next year, and getting started on that old dreaded budget. And while we don’t look forward to the long hours that accompany it, this period of the year actually gives some of us the renewal we need to keep pressing ahead.

One of the things that has always fascinated me is the dichotomy that exists between planning and change- a contradiction, if you will, between change (which by its nature is dynamic) and strategic plans (which have historically been rather static, at least from the standpoint of the plan that results).

Historically, planning was thought of as an activity designed to reduce uncertainty, or “manage change”, if you will. Of course, our plans have all the basics- vision, mission, objectives, strategies, initiatives, tactics, financial projections, operational implications, performance measures, targets, and implementation plans. For organizations of our size, that’s a lot of STUFF… so it should be no surprise that many of our organizations take most of the fall and early winter to build, refine, or update their plans- both long and short term. And by December or early January, we have a work product that is based on extensive analysis of all that stuff- THE PLAN- which is often memorialized in a physical document (the proverbial “planning binder”, and the even more important “almighty budget”).

There is nothing inherently wrong with preparing a plan and using it to guide the organization forward. Memorializing our vision, strategies, and tactics is not only necessary, but is by all accounts a good thing to do. But it’s what you do with that plan, and how you use it, that can make all the difference in the world.

Unfortunately, many organizations allow their plan to get “fixed” into the culture of the business. The plan remains on the bookshelf, only to get referenced periodically to validate our actions and thinking. In many cases, the plan almost acts as a history book, or “bible” for our strategic thinking. And that’s where the problem arises.

Optimally, strategic plans should be dynamic, living documents. While it may seem odd to some, I am not a real advocate of documents that “memorialize” the strategy. I’d rather see the documentation focus on the assumptions, and on the strategic options we would employ as those assumptions pan out or change. This kind of “options oriented plan” puts more emphasis on the process and the underlying assumptions than it does on memorializing the strategies and tactics chosen to respond to a fixed set of assumptions. “Options based planning” and its close relative “scenario planning” are both examples of dynamic planning. In both form and function, they are far better suited to the reality of change that occurs daily in our business lives.
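
For those who want to see the distinction in concrete terms, here is a minimal sketch of what an “options oriented” plan might look like as a simple structure: the assumptions carry the weight, and the options get exercised as those assumptions pan out or change. The assumptions, triggers, and options shown are invented examples, not a recommended strategy.

# A minimal sketch of an "options oriented" plan: the emphasis is on the assumptions,
# and on the options that get exercised as those assumptions pan out or change.
# The assumptions, triggers, and options below are invented examples.

PLAN = [
    {
        "assumption": "regional demand grows 3-5% per year",
        "holds_if": lambda observed_growth: 0.03 <= observed_growth <= 0.05,
        "options_if_holds": ["expand existing capacity"],
        "options_if_broken": ["acquire capacity", "partner with a regional provider"],
    },
    {
        "assumption": "no new low-cost competitor enters the market",
        "holds_if": lambda new_entrants: new_entrants == 0,
        "options_if_holds": ["hold pricing and invest in service levels"],
        "options_if_broken": ["reposition as the low-cost provider"],
    },
]

def options_in_play(observations):
    """Re-evaluate each assumption against what was actually observed this period,
    and return the strategic options that are now in play."""
    live_options = []
    for item, observed in zip(PLAN, observations):
        key = "options_if_holds" if item["holds_if"](observed) else "options_if_broken"
        live_options.extend(item[key])
    return live_options

if __name__ == "__main__":
    # Quarterly review: demand grew 2%, and one new entrant appeared.
    print(options_in_play([0.02, 1]))

The point is that the document stays useful as conditions change, because what gets revisited are the assumptions, not the binder.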

I offer these 10 questions to help you discern whether your strategic plan is “dynamic” or “static” when it comes to dealing with the realities of business and environmental uncertainty and change:

1. To what extent do your vision, mission, and key objectives “guide” versus “prescribe”? - I like to think of this as an airliner on autopilot, in which the system maintains the altitude and attitude of the plane within a band of acceptable limits around the preset parameters. It is acceptable for the plane to deviate slightly, as long as it stays within close proximity to those parameters, thus avoiding undue stress on the aircraft.

2. Does your organization have a “compelling narrative”- a strategic story that aligns with core elements of your strategic plan? In other words, how easily can your overarching strategy, mission, and key objectives be translated into a “30 second elevator speech” by each of your executives and key stakeholders? Or would their natural response be to go search for the most current strategic planning binder? Does your narrative contain a good description of “what success looks like“? Does it reflect overarching principles, or a prescribed set of tactics? (at this level, the former is preferred to the latter in what we call “dynamic planning”)

3. How aligned would that narrative be from stakeholder to stakeholder, and executive to executive? Is the core theme “embodied” in the fabric of the organization? And can it be repeated, at least thematically, laterally and vertically across the organization? Do your stakeholders understand the tactical flexibility that they have in implementing the strategic vision, or are they looking for a prescribed set of “to do’s”? Remember, a good strategic foundation/ narrative does not prescribe tactics, but establishes a strategic direction in a way that allows lower levels of the organization to identify with it, relate to it, and ultimately link into it with corresponding tactics. It doesn’t define their specific actions. A good narrative will produce those naturally in the tactical phases of your process.

4. Does your plan allow for changes in the operating environment, or is it dependent on today’s snapshot of the current situation? For example, is the plan to become a competitive provider of business services (something that is based on today’s competitors and their position), or the low cost provider (which allows for the realities of new competitors and business models)?

5. How balanced is your strategic plan and roadmap? Rarely does a focus on a single measure survive past the current operating environment. In the above example, would we focus exclusively on cost, or would our strategic ambition include other areas like service delivery and customer retention? In the “autopilot example” in question 1, the plane does not fixate only on altitude; the autopilot also manages attitude, pitch, and other variables within its parameters.

6. Do the intermediate levels of your plan embrace the potential for different scenarios and contingencies? That is, do you have multiple options for achieving the same business model and outcome? It’s OK to attach a higher level of importance to one of your strategies or tactics. But to become fixated on one strategy that has a 60% probability of success is shortsighted.

7. Does your planning process include some analysis of options value and alternatives? Options analysis can be of enormous value in a strategic planning process, and the lessons learned here can be significant. For example, one of my past clients was able to discern the difference between saving a dollar of O&M and saving a dollar of capital- almost a 7:1 tradeoff. That kind of analysis can really inform your planning and subsequent decision making (a simple sketch of this tradeoff follows this list).

8. Do your roll-down performance measures reflect the same level of balance, flexibility, and outcome orientation as your top-level plan? This is really a reflection of how “connected” your plan is throughout the various levels of the plan architecture. But it also speaks volumes about the degree of balance among your objectives and the level of completeness in your tactical and operating plans. For example, if your tactical plans and performance targets were achieved, would there be a corresponding level of success for each of the key strategic options identified for implementation?

9. How often do you review/ iterate your plan in an effort to “rebalance” and evaluate changes in contingency options? For example, does the review look like a once-a-year “dusting off” of the plan, or do you continuously review (monthly or quarterly) the relevance of, and changes to, key assumptions and scenarios?

10. Does your plan and performance measurement system have “strategic staying power”? Can you effectively differentiate between a static plan that doesn’t change and what we call “strategic staying power”? By the latter, we mean: once measures have been put in place and are renewed during plan reviews, do those measures survive changes in organization and personnel? One of our clients actually employed a “vesting approach” in which managers were compensated on the success of a particular measure whether or not they still had direct accountability for that area. This helped compensate for a rapid-turnover environment in which managers would otherwise remain shortsighted as they “eyed” future opportunities. Instead, they ended up with high degrees of “carry-forward alignment” and teamwork in helping their successors achieve success.
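
As promised in question 7, here is a minimal sketch of the kind of options-value arithmetic involved in weighing a recurring O&M dollar against a one-time capital dollar. The discount rate and planning horizon below are assumptions chosen purely for illustration- they are not the inputs behind the 7:1 figure above, and your own inputs will change the answer.

# A minimal sketch of weighing a recurring O&M saving against a one-time capital saving.
# The discount rate and horizon are assumptions chosen for illustration only; they are
# not the inputs behind the 7:1 example above.

def pv_of_recurring_saving(annual_saving, discount_rate, years):
    """Discount a recurring annual saving back to today's dollars."""
    return sum(annual_saving / (1 + discount_rate) ** t for t in range(1, years + 1))

if __name__ == "__main__":
    rate, horizon = 0.10, 20          # assumed discount rate and planning horizon
    pv_om = pv_of_recurring_saving(1.0, rate, horizon)
    pv_capital = 1.0                  # a one-time dollar saved is already in today's dollars
    print(f"$1/yr of O&M saved is worth ~${pv_om:.2f} today, "
          f"vs. ${pv_capital:.2f} for a one-time capital dollar- "
          f"about a {pv_om / pv_capital:.1f}:1 tradeoff under these assumptions.")

Change the rate or the horizon and the ratio moves- which is exactly why the analysis is worth doing explicitly rather than assuming a dollar is a dollar.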

In short, you don’t want your plan to get so locked onto a specific tactic or objective that you lose sight of other options and contingencies that would contribute equally or more to your overarching definition of success.

I know there are some who see a “lock in, and implement at all costs” approach as far superior, in that it maintains focus and eliminates the distraction of continuously iterating the plan. They may see the flexibility embedded here as a bit of a contradiction- something that prevents strategic focus. If you are in that camp, I would encourage you to look more closely at the distinction between the different levels of the process. While I do endorse analysis, the definition of options, and contingencies, I also concur with “locking in” on the business model, strategic intent, and the overarching narrative of the business. At the same time, however, I like to see a process that allows for identifying various ways to achieve the planned outcome, for dealing with fall-back contingencies, and for revisiting the underlying foundation as a last resort when operating or environmental conditions change.

And above all, remember this- planning is a process, not an outcome. If you maintain that perspective, it will be a lot easier to implement a planning solution that is dynamic, flexible, and effectively drives long term success.

Tomorrow, the temperature in the Northeast is expected to be back into the upper 80’s/ low 90’s. Some of my plans will change based on that. My plans for the weekend might look more like a “summer plan” than a “fall plan”. We roll with the changes in the environment we live in. We accept change, and if our mindset remains open, we can actually thrive on it.

As I look at the news today, there are many down on the South Central and Southwest Gulf Coast whose plans will no doubt be changing this weekend with the approach of Hurricane Rita. Our thoughts and prayers are certainly with them. That said, there is no better way to explain the importance of a flexible planning perspective than to look at what our brethren down south have been dealing with for weeks. We can learn much from them- in particular, from those who can roll with the punches and still keep perspective on what really matters, principle-wise. Those are the true heroes.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


 

Peer Benchmarking Initiatives- Revisited

Over the past several weeks, I’ve gotten more than a few comments on my June 16th column regarding Peer Company sponsored initiatives. Given the volume of comments, which ranged from “spot on” to “downright delusional”, I thought it would be a good idea to take another look at this topic.

First, my acknowledgements to some of the more prominent programs that were mentioned in the article. My intention was not to endorse or condemn any of these programs specifically. In retrospect, my biggest failing may have been that I “painted them all with the same brush”. That was not my intention. My failure to discern between the “good” and the “not so good”, while perhaps frustrating to the sponsor companies, was, however, deliberate. The programs I mentioned by name were mentioned only to help the reader recognize what I meant by “peer company sponsored initiatives”- NOT to endorse or condemn any one in particular. If it was interpreted as anything other than that, I do apologize to both the program sponsors and their participants.

For clarification, my main objective with the article was to offer a guide, or checklist, to help the reader discern what to LOOK for, and what to LOOK OUT for, in such programs. All programs- whether sponsored by peer companies, consultants, or independent facilitators- have strengths, weaknesses, and risks. In fact, if you look through my past columns on benchmarking (see article index), you’ll find that I offer similar analysis and guides for other types of programs as well. There is no perfect solution. Again, my only objective was simply to help the reader discern what is best for them.

This year alone, the number of programs available (peer sponsored, and others) will nearly double. And because of this, many companies are facing the tough decision of which ones to participate in, and which ones to pass on. Unlike in the early 90’s, resource limitations now prohibit companies from participating in everything out there. And despite what may be advertised, none of these programs are “free” on any dimension.

Our research has shown the cost of data collection alone to be many times the “entry fee” of such programs (assuming there is one). Offering a way for the reader to pick and choose in an educated manner was, again, my main objective. My other objective, while a bit in the background, was to encourage the facilitators of such programs to respond to these risks, and to help mitigate them for their members.

For example, within a few weeks of publishing the June column, I learned that one of these programs now requires executive review and approval prior to admitting a new member- a tactic that clearly manages one of the key risks identified. As new programs come on line, I encourage the facilitators and members to remain conscious of these risks/ issues, and continue managing them as appropriate. Those that do will no doubt end up as the best “draws” for future members.

As a refresher, I recommended several key questions for companies considering peer sponsored benchmarking initiatives. These questions are just as relevant today as they were in my original column. Some of the peer-sponsored programs manage them well, and some don’t. As I indicated in June, and barring any formal comparison of these initiatives (something we may elect to provide in the future), it’s up to the reader to decide what is right for them. These questions/ issues are offered only as a guide to help inform the prospective member.

1. Do you know the GENUINE REASON the company is offering such a program? (Is it documented, written down, and accepted by executives of both the sponsor company and the participating company?)

2. What is the REAL COST of the program? Again- both to the sponsor company’s shareholders and to the member. What will the real cost of data collection be? Is it redundant with other programs? What is the sponsor company doing to mitigate this cost?

3. How does the program INTEGRATE/ interact with other similar initiatives? Does it compete against them (creating more redundancy), or will it partner with them on data and other types of integration?

4. Does the program require both managerial and executive-level APPROVAL and OVERSIGHT? Are competitive concerns and/or antitrust issues known and mitigated?

5. How will they PROTECT your data? What assurances do you have?

6. How ROBUST is the membership? Are there enough participating companies to provide meaningful information for your particular demographic or type of infrastructure?

 

While this may not be an exhaustive list, it is a start. I invite any of you to add to this list by commenting on it, and I will publish any additions/ modifications in future columns.

To me, some of these issues are obviously more important than others. As I go through the above list, I believe the most important issue to be managed TODAY is that of redundancy and duplication of resources. And as more of these programs come to market, this cost will become more and more visible. For example, it would be nice to see some significant effort to merge data requirements so that the member only needs to collect the information once. Some of this occurs today, but only in a very informal and ad hoc manner. More often than not, these programs end up “competing” with each other for very scarce resources. While each program may have something unique to offer, most require very similar inputs (data requirements) to arrive at their specific end point.

An analogy to consider- there is a reason why there is only one set of wires running down my street. Anything more would result in stranded investment and underutilization of assets. And in a world of scarce resources, that can’t be good for the buyer. Likewise with benchmarking initiatives- there is a lot of this type of redundancy and stranded investment in the world of benchmarking. Call it wishful thinking, but why not have some type of “data clearinghouse” that feeds each program the data it needs, while eliminating the duplicate data collection that is present today?
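
Purely as a thought experiment, here is a sketch of what such a clearinghouse might look like in practice: the member collects each data element once, and every program draws only the fields it needs. The program names and data fields below are invented for illustration.

# A thought-experiment sketch of a benchmarking "data clearinghouse": the member
# collects each data element once, and every program draws only the fields it needs.
# The program names and data fields are invented for illustration.

MASTER_DATA = {
    "total_o_and_m_cost": 125_000_000,
    "customers_served": 950_000,
    "employees_fte": 1_800,
    "outage_minutes_per_customer": 94.0,
}

PROGRAM_REQUIREMENTS = {
    "Peer Cost Benchmarking Forum": ["total_o_and_m_cost", "customers_served"],
    "Reliability Panel": ["outage_minutes_per_customer", "customers_served"],
    "Workforce Productivity Study": ["employees_fte", "total_o_and_m_cost"],
}

def feed(program):
    """Extract just the fields a given program needs from the single master data set."""
    return {field: MASTER_DATA[field] for field in PROGRAM_REQUIREMENTS[program]}

if __name__ == "__main__":
    for program in PROGRAM_REQUIREMENTS:
        print(program, "->", feed(program))

Each program keeps its own end product; what gets shared is the collection effort.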

We must approach and address each of the other risks in a similar manner. Only in this way can we have programs that truly offer a win-win for both the member and the sponsor.

Again, if I offended any of the program facilitators or members by my words or tone in the June 16th column, I sincerely apologize. But I do, and will continue to, strive to offer observations and feedback that strengthen our collective ability to better manage our performance. And to this end, your comments and feedback are both welcome and appreciated. Please direct any of your comments to

rchampagne@epgintl.com

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com