Lessons from the Fairway

Over the past week or so, I had the privilege of preparing my two boys, Bobby (10) and Michael (8), for the USKids Golf Qualifier for NJ. As most of you know from my past posts, they have both been avid golfers since they were toddlers. Their competence far exceeds that of anyone in our immediate family, so I have no idea where they get it from. Nevertheless, they perform astoundingly well for their age, and the pride we have for them is beyond measure.

Even more amazing than their performance, however, is the number of life lessons and “takeaways” that come out of every competition they enter- not only for the players, but even more so for the parents. Although there were many personal learnings that came out of this week’s competition, there were a few that lent themselves particularly well to the discipline of performance management.

Unlike most of their tournaments, which play 36 holes over two days, this particular event was restricted to a single 9 hole qualifier. While it may seem easier on the surface, this format leaves a lot less room for error. That is, while nearly every player in a 36 hole event has their share of ups and downs, a 9 hole event demands near perfect execution. If their game is “on”, kids that age can typically perform well for strings of 9-12 holes before hitting a few rough patches. Few of them can do that over a 36 hole stretch. So as we prepared, we focused our practice on getting off to a good start and sustaining that level of play for the balance of a solid 9 hole round.

When we arrived at the course on Tuesday morning, we were confronted with pretty bad weather conditions- wet and rainy. Typically, after a rain shower, the greens play much softer, requiring the player to putt a little firmer than they normally would. While my 10 year old is a near perfect putter inside 6 feet, the rain created a problem for him right off the bat. He misread the first green (expecting it to be slower than it ended up being) and blew the putt 2 feet by the hole, leading to a double bogey. A tough way to start, requiring him to dig himself out of a very deep hole. More importantly, that first misread forced him to be overly “tentative” on his putts for most of the day, and he missed 7 putts inside of 6 feet, usually by no more than a few inches. The score showed it. Having had practice rounds of 45 and 44, he ended up with a disappointing 49 (far better than his old man could ever do, but still disappointing to him)- :(

Naturally, as any parent would, I tried to talk him through it, putting an optimistic spin on things- showing him what he did well and looking at areas to improve. All in all, he concluded that the day wasn’t as bad as the score suggested. In fact, it was a pretty darn good day. Net of the putting problem, he might have actually broken 40, which would have been his best round ever and certainly the winning score. Of course that’s playing a little of what I call “would’a, should’a, could’a”, but at least it gave him another perspective on what could have been a really demoralizing round. Instead he walked away with one big lesson about putting strategy in inclement weather, not a retooling of his entire game.

What can this teach us about the discipline of performance management? Well, let’s look at how organizations gauge what they need to improve and when. Just like the golfer who draws inappropriate conclusions from the overall score, businesses can be seriously damaged by looking solely at top-line measures like ROI, ROA, or EPS. You’ve got to have a good dashboard that lets you assess your “whole game” before you start tinkering with processes, strategies, and business models. It could be just one small facet of your game that’s out of sync rather than the entire business model. In my 20 years in performance management, I’ve seen too many cases where businesses, on the advice of misinformed advisors and consultants, retooled their business to correct problems that didn’t even exist. A good performance management dashboard can prevent such mistakes and tell you exactly what needs work and what doesn’t.

In the case of my son, the dashboard looked pretty good. Drives were long and generally straight, and he was able to hit many greens in regulation (meaning his long and mid range game was well intact). These are two of his main indicators on his personal performance dashboard. On the holes where he didn’t hit the green in regulation (reaching it in 2 strokes on a par 4, 3 on a par 5), he was able to pitch or chip the ball to within only a few yards of the hole. So his short game was very much “on”, another key item on his dashboard. In fact, when we reviewed his putting, his “line” (aim) was dead accurate. The only area that didn’t work for him that day was his distance control, which on a “normal day” would have been quite good. So it wasn’t “distance control” in general that he needed to work on, but rather how to adjust his distance control in inclement weather. To sum up, there was one very specific “thin slice” of his game that needed work. Although his score suggested he was way off the mark, his dashboard suggested otherwise. A little work on that one area, and he’s back to breaking 40. Not bad for a 10 year old.
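For readers who think in spreadsheets rather than scorecards, here is a minimal sketch of the same idea in code. The metric names, values, and targets are purely hypothetical- the point is simply that when each indicator is judged against its own target, a bad headline score can be traced to the one “thin slice” that actually needs work instead of triggering a full rebuild.

```python
# A minimal "performance dashboard": each indicator is judged against its own
# target, so a bad headline score can be traced to the specific thin slice
# that needs work. All metric names, values, and targets are hypothetical.

DASHBOARD = {
    # indicator: (observed value, target, higher_is_better)
    "driving_accuracy_pct":       (71, 65, True),
    "greens_in_regulation_pct":   (50, 45, True),
    "short_game_up_and_down_pct": (60, 55, True),
    "putt_line_accuracy_pct":     (90, 85, True),
    "putt_distance_control_pct":  (40, 80, True),  # the one weak spot that day
}

def review(dashboard):
    """Flag only the indicators that miss their targets."""
    for name, (value, target, higher_is_better) in dashboard.items():
        on_track = value >= target if higher_is_better else value <= target
        status = "on track" if on_track else "NEEDS WORK"
        print(f"{name:<30} {value:>4} vs target {target:>4} -> {status}")

review(DASHBOARD)
```

Run against a round like the one above, only the distance-control line gets flagged; every other indicator stays green, which mirrors the conclusion of our post-round review.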

For some, the above may look like I’m just putting an overly optimistic spin on things- a “glass half full father”. That is certainly part of it. But in this case, the optimistic viewpoint was grounded in clear measures of performance for every aspect of his game. If you have good measures that are linked to overall performance, then you can look at the dashboard and see pretty quickly whether a positive or negative interpretation is in order. Without the dashboard, you’re really flying blind, with little if any idea of whether massive change is needed.

So next time you’re faced with a bad quarter of performance, try looking a little deeper at your total dashboard for the answers. In some cases, you’ll have to dive into some pretty heavy process and organizational changes. In other cases, and perhaps more often than not, it may only be a minor adjustment that puts you into the zone of performance excellence. Only your dashboard can tell you that.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com

The Good and Bad of Consultant Sponsored Benchmarking Initiatives

Over the past few days, I’ve been preaching the message: “BE LEERY OF THOSE WHO WANT YOUR BENCHMARKING DATA!” I’ve talked about new technologies and internet sharing platforms, peer company sponsored initiatives, and regulators, among others. One group that I deliberately left out was the management consultants who run these types of consortiums. This is a much harder group to discern, largely because there are many types and flavors of consultants to choose from.

I must preface this column with the fact that there are a few “good guys” out there- firms whose programs have been around a long time and who run them for the right reasons: to help their clients keep a pulse on their performance. They don’t abuse the data they get, and they keep a clear line between the benchmarking and consulting sides of their business. These organizations exist and are easy to find if you know what you’re looking for.

There are other consultants, however, who are after two things: BIG relationships and BIG projects. Many consulting firms I know run benchmarking and similar types of consortiums for one purpose- to “wedge inside” your business, identify problems, and then build a MUCH BIGGER relationship with you to help you solve those problems. There’s not a whole lot wrong with that. After all, I was one of those consultants in my past life. But when the ONLY reason they run those benchmarking programs is to land the bigger project, things begin to go south. Remember, there are many “best in class” performers who benchmark simply to stay in touch with where they sit vis-a-vis others. I’ve seen too many of these companies get very distracted by what I call “consultant manufactured projects” they didn’t need.

As I stated before, there are some consultants who don’t operate this way. But there are many who do. Be on guard as to why your consultant wants you to share your data, and use the checklist posted in our previous column “Rules of the Road” to help you find the partner that is best for you.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Peer Company Sponsored Benchmarking- What to Look Out For

In many industries, there are companies who sponsor benchmarking and other types of “data sharing consortiums”. In the Utility Industry, where we do a significant amount of work, a growing number of organizations want you to exchange data openly with them rather than with consultants or other facilitator-brokered services. PSE&G Peer Panels, Southern Company, and others are among the top “draws” for this kind of information sharing among utility organizations. Other industries have similar types of players.

It is not my intention to label these programs “good” or “bad”. In some cases they serve a valuable purpose for many of the participants. But you should also know that these programs are laced with risks- risks that must be understood if you are to manage your involvement proactively.

At the core, you need to ask yourself WHY these companies would offer such programs, other than to gain valuable competitive intelligence, feed potential acquisition analysis, or serve other covert purposes. If you’re concerned about their motive, don’t join.

But first, you need to get past the “lure” of these programs, as they all are VERY good at drawing companies in with the following arguments:

– They say the reason they manage these programs is to offer a public service to the industry. That sounds all too altruistic to me. See below- there is no free lunch….

– They say these programs are free. News Alert- they aren’t! Someone always pays. If it’s the shareholder who pays, then 9 times out of 10 they are looking for competitive intelligence. If it’s the ratepayer who pays, then these companies are about to have bigger problems on their hands, as there are few ratepayers who would support giving their money away to ratepayers in other jurisdictions. If you disagree, find me a few who would. Is there even one out there?

– They will also tell you that they provide these programs to “protect you” from the BIG BAD consultants. Well, here’s another News Flash: there are TOO MANY alternatives out there for you to sell yourself out to your peer companies (future competitors). If you want to stay away from consultant sponsored initiatives, look to some of the other alternatives before you settle on a peer company initiative, as the latter are a bit too risky in the long run.

– They market to “manager” rather than “executive” level individuals. Why? Because mid level managers are more apt to share data without worrying about competitive concerns. Career advancement, workshops in cool locations, and networking are among the biggest drivers for these managers. Executive staff have many more concerns about confidentiality and the value of protecting strategic data and insights, and are often in a much better position to judge when and how to make such tradeoff decisions.

– They will tell you that they are the only option if you want “lots of participants that look like you”. True, these programs are good draws. Also true that these companies look a lot like each other- same industry, same region, similar regulatory environments, similar management practices. But is this necessarily an advantage? Perhaps the biggest commonality among these companies is that their data sharing protocols may be a little “too loose”. It’s also worth pointing out that a group of 20 or so in a sector like Utilities is still a small fraction of the industry. If you total up all utility companies worldwide (and despite conventional wisdom, there IS a lot to be learned from offshore peers!), there are literally thousands. Even counting the 4 or 5 subsidiaries many of those big holding company members bring with them, 20 members covers perhaps 80 to 100 operating companies- maybe 2% of the industry. Hardly a quorum!

Once you get past the “lure” of these programs, you can then begin to filter out the good from the bad, or at least identify the ones where the risk/reward profile leaves a lot to be desired. Are there good programs out there? You bet. But it’s your job to evaluate your benchmarking partner on each of these factors. It’s also important to have a good “rules of the road” checklist (see past post “Rules of the Road”) to use for every invitation you receive relating to data sharing. Without this, you put your company, and yourself, at risk.

Benchmarking is a fact of life in best performing organizations. Withdrawing from the game of data sharing is NOT the answer. Managing the process proactively IS.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Data Sharing “Rules of the Road”

Over the past several years, my partners and I have spent considerable time and effort applying our performance management technologies in the Energy and Utility sectors, among others. One of the unique things about that sector is the “extremely open” nature with which companies share performance information and best practices with each other. Perhaps a little TOO OPEN? When was the last time you heard of Procter & Gamble sharing information with its closest competitors THAT openly?

Am I saying that you should stop sharing information altogether? Far from it. What I am saying is to TAKE SPECIAL CARE when doing so. You can instill a learning culture and share peer to peer information, but don’t do it without taking the right precautions. As much of a paradox as it is, you can BET that P&G is one of the best benchmarkers and learning organizations around. They are just very deliberate and careful about how they do it.

Here are some tips that will help you “manage” the information as you go about your benchmarking and best practice acquisition process.

a) Don’t even THINK about sharing information in an UNBLINDED fashion. This is cardinal sin #1. The ONLY reason someone would want you to do that is to be able to strip it down and glean information for competitive gain. If the information is blinded, the consultant won’t be able to target you for that lucrative project unless YOU want him to. And that overly philanthropic peer company won’t be able to use the data for competitive positioning. As long as your data stays masked, only you are in control of it. Peer companies will often tell you that having the data unblinded is necessary in order to maximize value from the program. Hogwash. There are TOO MANY ways to foster learning with blinded data (I have discussed several in previous posts, and there’s a simple sketch of the masking idea right after this checklist) for you to give in to that kind of BS. Either share information in BLINDED FASHION or NOT AT ALL.

b) Don’t begin without clear Executive approval. Make sure they know EXACTLY how the program will work, make sure you know what your company’s PROTOCOLS for that sharing are, and follow that process diligently. Trust me, your executives are the only ones with a broad enough purview to make good decisions and tradeoffs between information sharing and shareholder value. A second, deeper concern I have is decentralized sharing that is not managed centrally. True story: a member of one of these “peer company” sponsored initiatives told me that despite an “iron clad” confidentiality agreement, and the decision to mask data, the FIRST thing that occurred at their results meeting was an exchange of identity codes! No kidding. Do I believe this was endorsed behavior? Absolutely not. It was, however, a direct product of information sharing being too decentralized- to the point that the employees swapping the codes had lost touch with their corporate policies on information sharing.

c) Demand Confidentiality / Non-Disclosure agreements. I’m not talking about those little “we won’t tell anyone if you don’t” type statements. I’m talking about agreements that will hold up legally. Be clear about when and how the information can be used- for example, our data cannot be used outside of x, y or z departments for purposes other than a, b, and c. If the information is used for purpose d (e.g. acquisition analysis, competitive targeting, etc.), be clear about the penalty and how it will be enforced (who will enforce it, in what jurisdiction, etc.). Also, the less “blinded” the information is, the more complex the non-disclosure agreement must be. The first line of defense is in what you share and what the peer company can glean from the data. The non-disclosure is your SECOND line of defense. Once they’re in, they’re hard to stop.

d) Avoid consultant and peer company sponsored initiatives. Then who do you use? There are many sources and technologies out there, from published studies to internet driven benchmarking tools and services ( http://www.benchmarkcommunities.com/ , for example). The key here is to avoid anyone who doesn’t have benchmark facilitation as their CORE business. The farther their core business is from the facilitation of these programs, the farther you should stay away from them. And by the way, there are consultants whose sole business is benchmarking. In terms of benchmarking, these are the “good guys”. Remember though, these companies are the exception rather than the rule as far as consultants go. So be on guard.

e) Make sure your partner’s data management process is bulletproof. Even with an ironclad NDA in place, a poorly configured process can be as dangerous as not having an NDA at all. Look for assurances from your vendors or partners that their process is secure. Ask to see their process. Was it audited or tested for compliance? Is it ISO certified in these domains? Is the data transfer technology and platform secure?
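On point (a), and as promised above, here is a minimal sketch of what blinding can look like in practice before anything leaves your building. It is written in Python with hypothetical field names and record structure- not any particular consortium’s format- and simply shows the principle: real identities are swapped for random codes, and the key that maps codes back to companies never gets shared.

```python
import secrets

def blind_submission(records, id_field="company"):
    """Replace real company identifiers with random codes before sharing.

    Returns (blinded_records, key). The key maps each random code back to
    the real identity and NEVER leaves the submitting company. Field names
    and record structure here are hypothetical.
    """
    code_for = {}   # real identity -> random code
    blinded = []
    for rec in records:
        real_id = rec[id_field]
        if real_id not in code_for:
            code_for[real_id] = "PEER-" + secrets.token_hex(4).upper()
        masked = dict(rec)
        masked[id_field] = code_for[real_id]
        blinded.append(masked)
    key = {code: real for real, code in code_for.items()}
    return blinded, key

# Example with made-up data: only `shared` ever gets submitted.
data = [
    {"company": "Acme Utility Co", "cost_per_customer": 412.0},
    {"company": "Acme Utility Co", "outage_minutes_per_customer": 96.5},
]
shared, secret_key = blind_submission(data)
```

The blinded records are what get submitted; the key stays locked up internally (or with an independent, trusted facilitator), which is what keeps you, and only you, in control of your identity in the results.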

There you have it- a nice checklist to go through each time someone invites you into a data sharing environment. Data sharing can be a very rewarding game if it is played right. But you need to be both cautious and prudent about the process. It’s kind of analogous to “let the buyer beware”, only we’re dealing with bigger companies, more shareholders, and bigger stakes!

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


The Benchmarking Technologies You DON’T See

Not too long ago, benchmarking anything was a major undertaking. Consultants and/or host companies would take us down a long and painful road of data collection, validation, workshops, and reports, all of which would ultimately lead to voluminous reports that probably still serve as “ornaments” on your bookshelf. Clearly, a painful process that often led to frustrated participants and meaningless information.

Well, now that we have all these great new technologies to help us with benchmarking, those times are gone, right? Wrong. Despite numerous advances in both approach and technology, benchmarking still leaves a lot to be desired. Why?

Not unlike many of our business processes, benchmarking suffers from our tendency to focus on activities rather than business outcomes. We spend lots of time and money automating legacy processes instead of truly rethinking how our process SHOULD work. The same is true of the technologies that should, but don’t, enable us to do better benchmarking. There is clearly no shortage of benchmarking technologies that companies can use to acquire and report benchmarking data. But are these systems actually producing valuable information?

In my humble but vocal opinion, today’s benchmarking technologies suffer from major weaknesses in EVERY stage of the benchmarking process:

1. Survey Design and Administration– most consultants and facilitators have put ALL their emphasis on the survey process. Poor prioritization, in my view. Sure, the survey is the most obvious process to try and automate. Since most data collection started as a manual process, it would only make sense to try and streamline it. What’s wrong with that view? Nothing really, except for the fact that most data collection PROCESSES lack both the strength and sophistication that are key in a good benchmarking program. So what we end up doing, in essence, is automating a pretty crappy process.

For example, the internet now allows us to collect data online. Big Whup!!! If that’s all you expect of your technology, then you’ve got bigger problems with your process. Data collection is the foundation of your whole benchmarking program. Many things are (or at least should be) accomplished in the data collection process. Putting data on a survey is only one of them.

Distributed data entry, error checking, aggregation, internal vetting, boundary testing (against specific definitions), external validation, and range checking are among the many other functions performed at this stage of the process. If the OUTCOME of data collection is QUALITY information, collected with the LEAST PAIN, in the FASTEST CYCLE TIME possible- then simply automating your old Excel spreadsheets has done nothing but administer your current process via the internet. It’s just another route to the same old destination. Nothing more.
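To make “validation at the point of entry” concrete, here is a minimal sketch in Python. The metric names, units, and ranges are hypothetical- the point is that boundary and range checks against the survey’s own definitions run while the submitter is typing, not weeks later on the facilitator’s desk.

```python
# Hypothetical survey definitions: each field carries its own unit and
# allowable range, so boundary and range checks run at the moment of entry.
METRIC_RULES = {
    "o_and_m_cost_per_customer":   {"unit": "USD/year", "min": 0, "max": 5000},
    "outage_minutes_per_customer": {"unit": "min/year", "min": 0, "max": 10000},
    "employees_per_10k_customers": {"unit": "FTE",      "min": 0, "max": 500},
}

def validate_entry(field, value):
    """Check one survey entry against its definition; return a list of problems."""
    rule = METRIC_RULES.get(field)
    if rule is None:
        return [f"'{field}' is not a defined survey metric"]
    problems = []
    if not isinstance(value, (int, float)):
        problems.append(f"'{field}' must be numeric, in {rule['unit']}")
    elif not (rule["min"] <= value <= rule["max"]):
        problems.append(
            f"'{field}' = {value} falls outside the expected range "
            f"{rule['min']}-{rule['max']} {rule['unit']}; please re-check"
        )
    return problems

# The submitter sees the problem immediately, not weeks later:
print(validate_entry("outage_minutes_per_customer", -12))
```

The same hooks can carry definition prompts, cross-field consistency checks, and external validation against peer ranges- the parts of the collection process a plain online form never touches.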

Technology today allows us to do SO MUCH more. Good benchmarking applications will address ALL of the components of your data gathering process, automate many of them, and turn your process into something that is better, faster, cheaper, and less painful than your existing one. That’s the true test for any benchmarking technology.

2. Results Presentation– This one is very connected with data collection. If your technology addresses all of the major aspects of data gathering, then you should only be one or two quick steps away from seeing your results. So why does your facilitator need weeks or months to deliver your results?

a) because your facilitator’s technology likely didn’t deal with data collection the way it could, or should, have. Validation didn’t occur at the point of entry, did it? Data still had to go to the facilitator for aggregation, didn’t it? Some definitions were misinterpreted, weren’t they? All of this adds layers of cycle time to your process. The fact that your data was transferred over the internet bought you nothing in terms of being able to use it any better or faster than you could have before.

b) because your facilitator focused ONLY on the data collection process to begin with. Today’s technologies allow your data to become instantly part of a relational database that can be easily queried and manipulated. More importantly, the internet allows that querying to be done in a distributed manner. Which brings me to the main benefit in the reporting process- CUSTOMIZATION!

So here we have two more requirements for a good benchmarking technology: ON DEMAND reporting (i.e. results available instantly upon data submittal), and CUSTOMIZATION (i.e. you define the form and function of how you view the results). Suddenly, on demand filtering of the peer group, “what if” and scenario testing, and many other possibilities begin to emerge. Does your benchmarking software do that?
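As an illustration of what those two requirements can look like once the data sits in a queryable store, here is a minimal sketch with a small, made-up set of blinded results. The participant- not the facilitator- picks the peer filter and gets the summary instantly; swap the filter and you have your “what if”.

```python
import statistics

# A small, made-up set of blinded benchmark results, already validated.
RESULTS = [
    {"peer": "PEER-01", "region": "NE", "size": "large", "cost_per_customer": 410},
    {"peer": "PEER-02", "region": "NE", "size": "small", "cost_per_customer": 530},
    {"peer": "PEER-03", "region": "SE", "size": "large", "cost_per_customer": 365},
    {"peer": "PEER-04", "region": "SE", "size": "large", "cost_per_customer": 480},
    {"peer": "PEER-05", "region": "NE", "size": "large", "cost_per_customer": 445},
]

def peer_report(metric, **filters):
    """Filter the peer group on demand and summarize the chosen metric."""
    group = [r for r in RESULTS if all(r.get(k) == v for k, v in filters.items())]
    values = sorted(r[metric] for r in group)
    if not values:
        return {"n": 0}
    return {
        "n": len(values),
        "best": values[0],        # lower is better for a cost metric
        "median": statistics.median(values),
        "worst": values[-1],
    }

# The participant, not the facilitator, defines the view:
print(peer_report("cost_per_customer", region="NE", size="large"))
```

Nothing here is exotic. The point is that once the data is relational and the querying is distributed, the weeks-long report cycle collapses into a query the participant controls.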

3. Best Practice Sharing– This is one of the big gaps in today’s benchmarking technologies. The tragedy of this one, however, is that it is not simply an oversight. IT’S DELIBERATE. That’s right. Benchmarking facilitators make their money by being the information broker, a service that was necessary 10 years ago. But today, they avoid these technologies because they move the facilitator out of the loop, or at minimum change their role. And change is not a nice word to these types of folks. Turf protection, protecting their job through retirement, etc. become the real motivators. And who pays for that? YOU DO. The technology exists to make learning and best practice sharing a CONTINUOUS part of our day to day jobs, not a once a year event that puts your benchmarking process at the mercy of whenever the facilitator wants to call a conference.

Don’t get me wrong, conferences are fun, especially when combined with little boondoggles like baseball games and trips to the big city. But when they are used to disguise the obstructionist role of the data broker, that’s when I call foul. A good facilitator will provide the technology to enable on demand peer to peer sharing and surveying. If they don’t, you’re better off finding a new source for your benchmarking information.

As you can tell, this is a subject I feel strongly about. When I see technology avoided because an individual or organization wants to slow down or limit change, it infuriates me. I see it too much, and it’s about time it changed. I encourage all of you to exert whatever pressure you can to get your providers to use your fees, or your collective effort, to FULLY employ these technologies instead of using YOUR resources to protect their little patch of turf. Sure, it will create less dependency on them for benchmarking, but if they play their cards right, it will result in a stronger relationship bond and a higher degree of business trust and integrity.

-b

PS- Here’s a tip. Want a better benchmarking technology- one that actually saves time, money, and employee frustration, while dramatically improving the quality and usefulness of information?

Visit http://www.benchmarkcommunities.com for a powerful solution that will help your benchmarking program serve you better!

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com