Building Sustainability Into Your Performance Management Program

Like some of you, I always enjoy watching the final round of a good golf tournament on a Sunday afternoon, particularly if it’s a close race to the finish, or if there is someone I like to follow in the final few pairings with a decent chance to win. But I think the most enjoyable tournaments I’ve ever watched involved a good “comeback story”- whether it is the overwhelming underdog about to claim their first real victory, or the encore of an otherwise “past legend” of the game.

Such was the case this weekend watching the Northern Trust PGA stop at “Riviera” outside of Los Angeles. Normally, this past Sunday’s finish would have been one of those I viewed in the background, since most of the golfers I normally follow had already worked themselves out of the tournament by Friday afternoon.

Usually by Saturday evening, you know who is likely to be in contention, as those who are either struggling or have taken steps in the wrong direction have moved themselves too far down the leaderboard to have a legitimate chance. Needless to say, as I approach that magical age of 50, having someone I like to follow at the top of the leaderboard is becoming a rarity these days unless I’m watching the Senior Tour.

But this Sunday was different. For the first time in a while, two of the “older farts” that I like to follow were actually making a run at it. Those were VJ Singh (47) and none other than Fred Couples (51), the latter of whom I grew up watching as a kid. When I was in high school, I can remember watching “Freddy” come on the scene and, within only a few years, begin to contend with all the golf “Masters” of that era (literally).

In sports, “over 40” is considered “old” and “over 50” is usually considered “time for the senior tour”. Except for the “honorary” roles at a few of the Major events, it is very rare to see someone over 50 in contention on Sundays. And when they are, it is most notable. When these players beat the odds and simply compete well (even if they don’t win)- Tom Watson at the recent British Open, for example- it is a special moment. But when they win, it is literally something to behold- something reserved for true legends. In fact, a win from Fred this week would have made him one of only three golfers (alongside Snead and Floyd) to win in four different decades, a true measure of “sustainability”.

As it turns out, Fred didn’t win this week, due in large measure to a bad performance on a single hole that essentially took him out of contention; he ultimately fell to the young Aussie, Aaron Baddeley.

One hole… that’s all it took to create a 3 stroke swing that killed most of the momentum built over 4 days and 65 holes of solid golf. Sad? A little. Here’s a guy over 50, riddled with chronic back injuries, who routinely wins or “places” on the Senior Tour, and who was actually in contention with 6 holes to play alongside a guy half his age. Impressive any way you look at it.

And that is what got me thinking about SUSTAINABILITY. What is it that differentiates certain athletes to be able to sustain performance over literally decades? And how can we apply these lessons to business success, and perhaps our lives in general?

Interestingly, those athletes who sustain performance over many years are sometimes those that never reach that elusive “#1 ranking” in their sport. They may have been #2 or #5 or even lower, but they were consistent in their performance over much longer durations, usually “hanging around” long after the #1’s had fallen or left the sport. The same is true of businesses. Companies that may never achieve #1 can be just as successful by being in the “top few” (even the top decile or quartile) if they can perform at that level in a sustained and measured way.

So what is it that makes that difference? Here are five factors that I submit as key answers to that question.

  • Build “Around The Core”– Over the course of an athlete’s career, or a company’s history, the likelihood of going through multiple periods of change is almost certain, as is the probability that more than a few of those changes will be of large magnitude. But despite this, those who sustain their performance usually have a “core” that they develop and build around. For an athlete, this is usually referred to as a playing “style”- a golfer’s unique swing, a quarterback’s throwing motion, etc. And while that “style” can be tweaked or refined from time to time, the core elements of it usually transcend different periods of a career. For good businesses, this usually shows up in the form of vision and values. While specific missions, goals, KPI’s and strategies will no doubt change over time, the core vision and values generally don’t.
  • Strategic Flexibility and Adaptability– Some may view this as a little contradictory to the above point, but here is the distinction: While the core tends to remain stable, the strategies and tactics can, and should, be somewhat fluid over time. Golfers and other athletes always “tinker” with different parts of their game and often solicit advice from coaches on what may be failing them at any particular point in time. They “adapt” their style to what may be needed to make themselves better. But rarely does this change the “core” of what defines them in terms of their long-term success. And when they do change something, it’s usually “off the course”. That is, they generally don’t change a fundamental strategy during the round, but rather do it on the range or in a practice round. When they “hit the course”, it’s generally all about execution. Businesses, too, need to adapt to changing market conditions, buying patterns, economic climates, and numerous other factors, while at the same time protecting the core of what distinguishes their excellence.
  • Broaden the “Perspective”- My view of athletes who maintain sustainable success is that they do in fact modify their goals over time. I originally wrote this as their “openness and willingness to modify a target”, but that conveyed more “weakness” than I really intended. It’s not that they change the target because they’re getting further into the lifecycle of their career, as much as it is changing the ultimate “perspective” and “horizon” around which they measure success. First time winners of the Super Bowl start thinking in terms of career goals versus simply seasonal goals. Golfers start thinking about world rankings and career wins versus weekly tournament successes. No doubt, every one of these athletes wants to win week in and week out, but I suspect most would sacrifice a short term gain if doing so jeopardized a longer term aspiration.
  • Keeping the Team “Healthy”– It would be hard to talk about sports or business “dynasties” without talking about the importance of keeping the team intact and healthy. For athletes, this is meant literally. Many an athlete has ended a career early because of injury. Stops and starts caused by chronic injury are exactly what prevents sustainability. Sustainability requires vigilance in keeping the body and mind healthy, which usually takes an ultra-strong commitment to training on the field and off. In business, the “healthy” team means doing your part as leaders to not only acquire the right talent, but to create an environment of nurturing and development that supports retention and peak performance. It also means keeping “unhealthy” influences, behaviors and practices far away from the human capital you’ve invested so much to develop.
  • A Culture of Learning– As trite as it may sound, this may in fact be the most important common denominator of sustainable performers. Almost every “hall of fame” athlete we know appears to have that “hard wired” sense of learning built into them. It’s a hunger for learning that seems to disappear quickly after the initial successes of “one hit wonders”. But for sustainable leaders, that hunger for learning is almost obsessive. And the same is true of long term business success. It’s evident when you walk in the doors of these companies. Everything from the charts you see on the walls, to the type of conversation and dialog you witness, speaks of learning.

As always, these represent only a subset of what I believe are the most powerful differentiators of long term sustainable success.  I welcome your additions, comments and thoughts as well.

There are no doubt places in our performance management programs where we can apply these principles. Start with the core and build from there. Do you have that “solid core” of vision and values? Or is this something you constantly change from year to year? Are you flexible and adaptable with respect to your annual goals, objectives and KPI’s? Or have your operating objectives, measures, targets and strategies remained the same for years or even decades? As ironic and paradoxical as this is, many companies have this backwards. They constantly mess with the vision and values, yet they have objectives, measures and targets that rarely get challenged or updated over many years. That’s always an alarm bell for me.

But it shouldn’t stop there. For example, have you built the right perspectives and timeframes into your performance program? Do you have more than just short term goals for achievement? Do you also have longer term sustainability measures to complement that dimension of your scorecard? Have you used your performance management program as a tool to develop, nurture and retain your human capital? And have you really started down that road of continuous learning?

As I mentioned in a previous post, a good performance management program is not just about measures and metric reporting. It is about a holistic and integrated platform for building sustainable business success.

-b

PS- As for Freddy, best of luck at Augusta. I know he’ll be back. The decade is still young.

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com

Dashboards Versus Scorecards- It’s all about the decisions they facilitate…

The one thing that most everyday drivers fear is that infamous “check engine” light. Unless it’s during the first few seconds of startup (the point at which every indicator on the dashboard lights up for a few seconds), a “check engine” alert is one of the few signals that indicate big problems are imminent unless something changes fast…as in, stop the car soon and diagnose, or run the risk of being abandoned on the highway with a very costly repair. If you are someone who doesn’t take your vehicle’s indicator lights seriously, trust me (from experience), this is not one you want to ignore.

Dashboards, Indicators, and Alerts…

There are many indicators around us that alert us to changes in status of a process, and deviation from what may be considered to be a “normal operating condition”. And the place where most of these indicators are visible is on our dashboards. Whether it’s the dashboard of your vehicle, the cockpit display on an aircraft, the bridge on a ship, or a control room in a power plant; it’s the one central place where status is monitored and response strategies are determined, most often by the operator of the asset.

Of course, these deviations from the norm that show up on our “dashboards” occur in varying degrees, and can signal very different things. A “service soon” indicator light on your car dashboard is more “suggestive”, and usually means it’s time for an oil change or tune up; you’ve generally got some time before it becomes a bigger issue. A “low fuel” indicator, on the other hand, is a bit more significant, and usually means you’ve got a finite quantity of miles left before you are what we might call “SOL” (although I’ve tested this threshold on occasion and can attest to the fact that there is some, albeit small, “cushion” past 0 to rely upon). And then there is the “check engine” light that most often means PULL OVER ASAP (as soon as safe and practical- but SOON).

I’m not sure about you, but I view the “check engine” light as analogous to an “airspeed alert” that a pilot might get right before a stall condition, or a traffic alert he gets when another aircraft is within the allowed separation tolerance. You might not yet be “past the point of no return”, but you’re pretty darn close.

Dashboards and Scorecards: Is there a difference?

In the Performance Management discipline, we often hear people refer to the terms “Dashboard” and “Scorecard” rather indiscriminately, with little if any conscious distinction as to what they each connote. I’ve often avoided getting too “wound up” about this, because getting caught up in corporate “buzz phrases” and semantics can cause us to miss the bigger issues at play. But after reflecting on this a bit, I think the differences here are in fact worthy of some discussion. Not because the words themselves are super important, but because it is critical that both components (whatever you call them) be part of your EPM solution.

So here is my take on the critical distinctions between the two:

  • Purpose: Dashboards are about helping you navigate the journey. Scorecards are about how successful the journey was.
  • Type of indicators included: Scorecards generally contain outcome results, while Dashboards are usually comprised of leading or predictive indicators.
  • Timeframe: Scorecards are periodic and longer term (weekly, monthly, annual trends) in their review horizon, while Dashboards are shorter term and can even be real time.
  • Reaction: Scorecards should provoke next steps that involve introspection and analysis (drill downs, mining insights, etc.), whereas Dashboards are usually designed to “signal” or “provoke” immediate actions or course corrections.
  • Targets: Scorecards usually report against a target, threshold or benchmark as a percentage gap and trend. Dashboards generally report metrics within or against tolerance ranges, outside of which a required change is signaled.

As with anything, these are not hard and fast rules, but they should give you a sense of where I personally see the distinctions.
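If it helps to see those distinctions in code, here is a minimal sketch. Everything in it- the function names, the readings, the tolerance band, the target- is illustrative only, not taken from any real EPM product:

```python
def dashboard_check(value, low, high):
    """Dashboard-style logic: compare a real-time reading against a
    tolerance range and signal an immediate course correction."""
    if value < low:
        return "ALERT: below tolerance - act now"
    if value > high:
        return "ALERT: above tolerance - act now"
    return "OK"


def scorecard_review(actual, target):
    """Scorecard-style logic: report the periodic result as a
    percentage gap against target, prompting analysis and drill-downs
    rather than an immediate reaction."""
    gap_pct = (actual - target) / target * 100
    return f"{gap_pct:+.1f}% vs target - drill down if material"


# Usage: a real-time engine-temperature reading vs. a quarterly result.
print(dashboard_check(231, low=180, high=220))    # immediate signal
print(scorecard_review(actual=9.2, target=10.0))  # periodic review
```

The two functions encode the “Reaction” and “Targets” rows above: the dashboard returns a go/no-go signal against a tolerance band, while the scorecard returns a gap-and-trend style figure meant for later analysis.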

Sure, there are some grey areas here. Outcomes for some may only be a part of a journey for others. There are also cases where an outcome indicator might be so important that it is worth tracking in both places- on the dashboard AND the scorecard. For example, some car dashboards have an indicator that tells you what MPG you are getting out of your fuel consumption in real time. But while MPG would normally be an “outcome” metric (i.e. scorecard material), it may also be useful to some of us in watching the degradation or improvement to MPG as we change driving patterns (rapid “gunning” and braking, versus more constant speeds, for example).

Examples for the Fairway

A conventional golf scorecard

A few days ago someone posted about this same topic, using golf as their main analogy. And while I agreed with most of what he said, some of his examples gave me pause as I thought of my time on the golf course.

We both agreed for example that the “stroke count versus par” was what you would always find on a conventional golf scorecard (hard to argue with that!). However, I would also say that stats like # of greens in regulation, putts per green, club distances, etc. should also be part of your scorecard, although maybe at a level or two down the chain. After all, these are things that need to be analyzed and challenged OFF the course (although I have been known to peek at them from time to time during the round). In fact, it is not uncommon for many golfers to track these very stats on their scorecard right underneath or beside their actual stroke count.

But if you put all of that on the scorecard, then what does the dashboard look like? Well, if we think about it in terms of the 5 criteria I provided above, it would likely be things like yardages to the hole (what I need in order to make my club selection), wind direction (what I need to shape my shot), # strokes ahead or behind the lead (what I need to manage my strategy), speed of the green (necessary in determining the line and speed of your putt), and the myriad of other factors that are utilized by professional golfers before and during a round. And while many of us may manage the above by “feel”, just take a look at a professional’s yardage book and caddy’s notes and you’ll see what looks strikingly similar to a dashboard (albeit manually illustrated with stray marks and notes). And if you want to spend the money, you can always buy some pretty cool dashboards for your iPhone or BlackBerry.

Some Final Thoughts…

I think some of the confusion between dashboards and scorecards is because metrics are often combined in the same visualization, regardless of whether it is called a scorecard or dashboard. Even automobile dashboards have a place for total accumulated miles. Golf GPS devices enable you to enter score count. As I said before, the distinction is sometimes blurry, and often not even that critical.

What’s much more important is whether or not you have the full complement of indicators you need to manage the business. When most executives ask their teams to develop “a dashboard”, the content of what they are really asking for is unclear. Are they hungry for better tracking of results? Or are they asking for better metrics- those that will enable better decisions and more responsiveness? Or are they simply looking for better analysis of the results?

Unless you understand that, it will be hard to deliver on any of these requests or mandates, regardless of what you end up naming it. In the end, Scorecards and Dashboards are merely visualization tools. What’s more important is that you embed and align the right content into these tools that will enable a clear line of sight between vision, objectives, KPI’s, metrics, and initiatives that tells the complete story and enables those who are in execution roles to be successful.

The bottom line is that you are the designer and architect of the info that is displayed, and so all these distinctions- whether it is between scorecard and dashboard, long term versus short term, leading versus lagging, etc.- are really only important in terms of their usefulness in helping YOU design a system that is relevant and useful within your organization. What you call these things is not nearly as important as whether the system produces the right outcomes.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com

Don’t Go Overboard on KPI’s

While much has been written in the past about performance management, most of it has dealt with things like the design of measures, development of targets, benchmarking, reporting methods, and IT solutions. Precious little has been written on the quantity of measures…essentially the question of “how many” measures an organization should have as you begin to cascade past the first few levels.

As most of you know from my past writings, I am a big fan of the “fewer is better” principle, the reason being that focus becomes distorted once you get past a certain number. Quite frankly, I don’t know psychologically why that is, nor do I really care. The less people need to remember, recall, and process, the more likely it is to stick. Ever wonder why things like social security numbers and phone numbers are broken up into three to four digit “clusters of numbers”? It’s been shown that people recall numbers of fewer than seven digits at far greater rates than they do longer ones, and recall is further enhanced by breaking numbers up into three and four digit “chunks”.

The number of measures shouldn’t be any different. In fact, the word KEY in key performance indicators (KPI’s) suggests the need for that very level of focus. But for some reason, the design principle steering today’s KPI development seems to be favoring the “more is better” principle over more focused measurement design. In the last three weeks, I either spoke with or visited five companies that have an executive KPI “dashboard” in place. Four of the five organizations (and they were NOT alike in any way- different industries, geographies, and cultures) had more than 15 KPI’s, with one of those organizations nearing 40!

So here are some things to check for to ensure you have the right number and type of KPI’s:

1. Don’t confuse “balance” with volume:

While organizations are encouraged to have a “balanced” set of KPI’s (e.g. a “balanced scorecard”), it does not mean that every business unit and functional workgroup in the organization’s structure needs to have the same degree of balance. Some functions exist for the sole purpose of moving one or two key indicators, and may legitimately have nothing to do with the others. You’re better off with that group being responsible for 3-4 relevant indicators instead of a “balanced” suite of 25.

2. Don’t let the complexity of your metrics portfolio dilute the vision and compelling narrative of the business:

Some of the best companies out there have developed a short and compelling narrative or “elevator pitch” that encapsulates the essence of the company’s vision, mission, and strategic plan (our history, current vision, purpose, main points about strategy, and how we will measure success). What’s important here is the ability to drive the “recall” of vision by the employees who are responsible for internalizing it and carrying it out. Better to have a few indicators they can relate to, internalize and influence than a multitude of indicators that go largely unnoticed.

3. Make the numbers mean something:

Often, that will mean avoiding the “index” or “roll up” type of indicators. These types of indicators often have meaning only to the person who built the underlying algorithm behind them. While it is ok to use these kinds of indicators sparingly (perhaps at the high levels where they can be easily interpreted), I’d be inclined to get these indexes quickly translated into units that represent results. For example, a CSI (customer sat index) of 45 versus metrics like % of customers dissatisfied with service call, % rework, and first call resolution %. If you can create meaningful #’s, the need to measure a large number of “component” metrics typically goes down, freeing up attention to focus on the drivers and causal factors that will end up having much more impact on maximizing your PM dollar.
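To make that point concrete, here is a hypothetical sketch. The metric names, values, and weights are all invented for illustration- the point is that the single index number conveys nothing on its own, while each component metric stands on its own in plain units:

```python
# Component metrics in units anyone can interpret (invented values).
components = {
    "pct_dissatisfied_with_service_call": 12.0,  # %
    "pct_rework": 8.5,                           # %
    "first_call_resolution_pct": 71.0,           # %
}

# The index collapses everything into one number via an arbitrary
# weighting scheme that only its author can decode (invented weights).
weights = {
    "pct_dissatisfied_with_service_call": -0.5,
    "pct_rework": -0.3,
    "first_call_resolution_pct": 0.2,
}

csi = 50 + sum(weights[k] * components[k] for k in components)

print(f"CSI: {csi:.1f}")          # meaningful only to the algorithm's author
for name, value in components.items():
    print(f"{name}: {value}%")    # each number means something by itself
```

Ask someone what a CSI of 55 means and you’ll get a shrug; ask them what “12% of customers dissatisfied with their service call” means and you’ll get a conversation.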

So there you have it, a simple list of three tips (not 5, 8 or 10, but 3)….hopefully simple enough to recall as you continue to improve your PM process.

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Garbage In-Garbage Out…

One of the age-old problems we encounter as performance managers is one of data reliability. While it should be, intuitively, the most important aspect of performance management, it is, relatively speaking, given much lower priority than its more “sexy” relatives.

ERP’s, data warehouses, analysis engines, web reports…the list goes on. Comparatively speaking, each and every one of these important PM dimensions gets its fair shake of mind space and investment capital. But as the old adage goes, “garbage in/ garbage out” (GIGO). We all know that data quality is a necessary pre-requisite for any of these tools to work as designed. So why is it that so little time and attention goes into cleaning up this side of the street?

Tell me you can’t identify with this picture. You’re sitting in a Senior Management presentation of last quarter’s sales results. Perhaps you’re even the presenter. You get to a critical part of the presentation, which shows a glaring break in a trend which has been steadily improving for months. It signals the obvious- something bad has happened and we need to address it now! Conversation turns to the sales-force, the lead qualification process, the marketing department, competition,… 45 minutes later- no real clarity, except for lots of “to do’s” and follow up commitments.

Fast-forward two weeks (and several man-hours of investment) later. The Sales VP is pummeling one of his sales managers to “step up” the performance, and wants new strategies. A new commission structure is discussed, which brings in the need to get HR and IT involved. A few days later, when working on implementing some of the new strategies, a new story begins to unfold. An IT analyst, deep in the bowels of the organization, astutely recognizes THE big missing piece of the puzzle. You see, last month, the manager of the Eastern Region changed the way he wants to see “sales-closes” reported (the way deals are essentially recorded), from one that is based on “client authorizations” to one based on “having the contract in hand”- a very useful distinction, particularly when viewed from a cash flow and accounting perspective. The only problem is that it was applied locally, not corporate-wide, resulting in the apparent data anomaly.

It sounds a bit too simple for a modern corporation well into the technology age, but unfortunately, this kind of story is all too common. We all understand the principles of GIGO, yet it continues to chew up corporate resources unnecessarily.

Overcoming the GIGO problem should be our number one priority- before systems, before reports, before analysis, before debate, and before conclusions are drawn. Before anything else, data quality is THE #1 priority.

Here are a few tactics for getting a solid “data quality” foundation in place:

1. Understand the “cost of waste”-

We measure everything else, why not measure the cost of poor data quality? Take a few of your last GIGO experiences and quantify what the organization wastes on unnecessary analysis, debate, and dialog around seemingly valid conclusions gone awry. This doesn’t have to be complex. Do it on the back of an envelope if you have to. Include everything that goes into it, including all the levels of management and staff that get involved. Then communicate it to your entire PM team. Make it part of your team’s mantra. Data quality matters!
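As a sketch of that back-of-the-envelope math- every figure below is an invented example input, not a benchmark- the calculation really is this simple:

```python
# Rough "cost of waste" from one GIGO incident: hours spent chasing a
# phantom problem, multiplied by a loaded hourly rate per role.
hours_by_role = {
    "executives":      5 * 2,   # 5 people x 2 hours of meetings
    "middle_managers": 8 * 6,   # follow-ups and re-analysis
    "analysts":        3 * 20,  # rework and data chasing
}

loaded_rate = {"executives": 250, "middle_managers": 120, "analysts": 80}

cost = sum(hours_by_role[r] * loaded_rate[r] for r in hours_by_role)
print(f"Estimated cost of this one data-quality incident: ${cost:,}")
# prints $13,060
```

Multiply that by the number of such incidents per year and the case for investing in data quality usually makes itself.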

2. Become the DQ (Data Quality) CZAR in your company-

Most performance managers got where they are by exposing that “diamond in the rough”. We got where we are by using data to be an advocate for change. It’s hard to imagine getting executive attention and recognition for something as “boring” as getting the data “right”. But that is what needs to happen. The increased visibility of post-Enron audit departments, SOX initiatives, and other risk management strategies have already started this trend. Performance Managers must follow. You need to embrace DQ as something you and your department “stand for”.

3. Create Data Visibility-

In some respects, this has already begun, but we have to do more. Our IT environments have the potential of disseminating information to every management level and location within minutes of publishing it. But let’s go one step further. Let’s “open the book” earlier in the process so more of those who can spot data issues earlier can participate in the game. What I’m saying here is that people have different roles when it comes to performance management. Some are consumers, and some are providers. It’s just as important to create visibility for the input factors, as it is to publish those sexy performance charts. You’ll get the input of that 4th level IT analyst I discussed above, much earlier in the process.

4. Utilize External Benchmarks Where Possible-

Benchmarks are often used within organizations to set targets, justify new projects, defend management actions, and to discover new best practices. These are all good and noble reasons to benchmark. One of the most overlooked benefits of benchmarking, however, is the role it plays (or should play) in your DQ process. I can’t tell you how many meetings I’ve been in where the presence of an external benchmark highlighted a key problem in data collection. Sometimes, seeing your data compared against a seemingly erroneous metric can show major breakdowns in the data in cases where they would have otherwise gone undetected. Using comparisons to highlight reporting anomalies can be a very valuable use of external benchmarks.

5. Establish a DQ process-

It would be nice if all data were collected in an automated manner, where definitions could be hard-coded, and “what to include” would never be in question. But in most companies, that is simply not the case. Our research has shown that over 50% of data used in our performance management process is still collected manually. But very few of these companies have a defined and auditable process for doing so. This does not have to be complicated, as there are some very useful tools emerging that help collect, validate, approve, and publish required data, just as there are for data reporting and score-carding. Having a process, and system to ensure that process is followed, are both critical elements in data collection, and hence make for very good investments.
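As a hedged illustration of what such a validation gate might look like, here is a minimal sketch using the Eastern Region “sales-closes” story from earlier as the rule being enforced. The field names and rules are hypothetical, not from any real DQ tool:

```python
def validate(record):
    """Check a manually collected record against the agreed corporate
    definitions before it is approved and published."""
    errors = []
    # Enforce the corporate-wide definition of a "sales close" so a
    # local change (like the Eastern Region's) is caught at intake.
    if record.get("close_basis") != "contract_in_hand":
        errors.append("sales close not recorded against the corporate-wide definition")
    if record.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    return errors


# Usage: the Eastern Region submission is rejected before it can
# pollute the trend line.
submission = {"region": "East", "amount": 125000,
              "close_basis": "client_authorization"}
problems = validate(submission)
if problems:
    print("REJECTED:", "; ".join(problems))
else:
    print("APPROVED for publishing")
```

The point is not the code but the process: collect, validate against shared definitions, approve, then publish- with the validation step happening before the data reaches a scorecard.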

6. Don’t forget the Culture –

As I said above, most data, for the time being, will be collected in a manual fashion without fancy IT infrastructure. People will still be at the heart of that process. Invest time in helping them see the importance of the information they are collecting, how that information will be used, and what process will be followed to do so. Many organizations spend tens of millions on a systems solution to what is largely a people/ cultural problem. Investing in training and coaching can be as high payback as those mega systems investments.

* * * * * * * * * * * * * * * * * * * * * * * *

So as you navigate through your internal data collection efforts, try and keep these tips in mind. Sometimes, it’s the simple “blocking and tackling” that can make the difference between winners and those in second place.

 

-b

Author: Bob Champagne is Managing Partner of onVector Consulting Group, a privately held international management consulting organization specializing in the design and deployment of Performance Management tools, systems, and solutions. Bob has over 25 years of Performance Management experience and has consulted with hundreds of companies across numerous industries and geographies. Bob can be contacted at bob.champagne@onvectorconsulting.com


Managing Those Elusive Overheads

One of the biggest challenges faced by operations management is how to improve costs and service levels, especially when such a large portion of these costs are perceived to be “outside” of their control.

Despite recent attempts to control corporate overheads, it’s still very common for corporations to load operating management with an “automatic” allocation for overhead costs such as Employee Benefits, IT, Legal, Facilities Management, Accounting…the list goes on. Our studies show that most of these costs are still allocated back to management as a direct “loader”, or percentage markup, on staff employed in the operating business units. Not only is this an unfair disadvantage to operating managers, who have little perceived influence on these costs, but it also results in a “masking” effect as these costs mysteriously get buried in the loading factor itself. Operating units struggle from year to year, trying to capture that next 1, 2, or 5% of efficiency gains, while over 50% of their costs are, in effect, off limits.
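To see how a percentage loader masks the very same dollars that an itemized view exposes, here is a hypothetical sketch; all figures are invented for illustration:

```python
direct_labor = 1_000_000
loader_pct = 0.55  # 55% markup covering benefits, IT, legal, facilities...

# Loader view: overhead is one opaque markup an operating manager
# can't see into, let alone act on.
loaded_cost = direct_labor * (1 + loader_pct)

# Itemized view: the same dollars, but each service is visible
# and can be questioned and managed.
overheads = {"benefits": 220_000, "it": 180_000,
             "legal": 60_000, "facilities": 90_000}
itemized_cost = direct_labor + sum(overheads.values())

print(f"Loaded:   ${loaded_cost:,.0f}")
print(f"Itemized: ${itemized_cost:,.0f}")
```

The totals are identical, which is exactly the problem with the loader: the information needed to manage the costs has been destroyed by the time it reaches the operating unit’s budget.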

But some organizations clearly understand these challenges, and have begun to make real strides in this area of corporate overheads. For some, it has involved ugly corporate battles, political in-fighting, and the “muscling in” of allocation changes. For others, the challenge has been a bit easier, thanks to a focus on what really matters- visibility of overheads, and a direct path toward managing them.

Here’s a quick list of areas you can focus on to improve the way overheads are managed:

Transparency– The first, and most important, driver for successfully managing overheads is making them visible to the enterprise. All too often, overheads from shared services functions are not visible to anyone outside the shared services organizations themselves. In fact, the word “overhead” has an almost mystical connotation- something that just shows up like a cloud over your head.

One of my clients once said, “The most important thing leadership can do is to expose the ‘glass house’. Overheads need to get taken out of the ‘black box’ and put into the ‘fish bowl.’” Once you can see the costs clearly, both operating and corporate management can begin making rational assessments about how best to control them.

Accountability– This is arguably one of the trickier overhead challenges, since managing overheads involves accountability at multiple levels. To simplify this challenge, most companies simply define accountability at the shared service level (VP IT, or VP Legal, for example) and leave it at that.

More successful organizations, on the other hand, split this accountability into its manageable components. For example, management of shared services functions can be accountable for policy, process, and the manner in which work gets performed. But there is a second layer that deals with “how much of a particular service” gets provided- and it’s that component that must be managed by operations, if we are to hold them accountable for real profit and loss (discussed below).

Doing this right requires some hard work on the front end to appropriately define the “drivers” of overhead costs that are truly within line management’s control. A simple example is Corporate IT, in which the IT department defines overall hardware standards and security protocols, while the variable costs of local support are based on actual usage and consumption of IT resources. That’s an overly simplified example, but it illustrates how the process can work. Most overhead costs have a controllable driver. Defining those unique drivers, and distributing accountability for each, will go a long way in showing how and where these costs can be managed.
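The split described above can be sketched as a simple itemized charge. The function, rates, and volumes below are all hypothetical, chosen only to show the two accountability layers side by side:

```python
# Sketch of an itemized overhead charge that splits a Corporate IT bill into
# a fixed "policy/standards" component (owned by the shared-services VP) and
# a variable component driven by the operating unit's actual consumption.
# The function, rates, and volumes here are hypothetical.

def it_charge(fixed_policy_cost: float,
              cost_per_ticket: float, tickets_used: int,
              cost_per_seat: float, seats: int) -> dict:
    """Return an itemized monthly IT charge for one operating unit."""
    variable = cost_per_ticket * tickets_used + cost_per_seat * seats
    return {
        "fixed (policy/standards)": fixed_policy_cost,  # driver: VP IT decisions
        "variable (usage-driven)": variable,            # driver: operations' usage
        "total": fixed_policy_cost + variable,
    }

charge = it_charge(fixed_policy_cost=40_000,
                   cost_per_ticket=75.0, tickets_used=120,
                   cost_per_seat=30.0, seats=500)
for line_item, amount in charge.items():
    print(f"{line_item:>26}: ${amount:,.0f}")
```

The point of the itemization is that operations can now shrink the variable line by managing tickets and seats, while the fixed line is visibly owned by the IT function.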

“P&L” Mindset– There’s been a lot of debate around whether shared services functions can truly operate like real profit centers. The profit center “purists” will argue that internal services should behave just like “best in class” outsourcers, and if they can’t compete, they should get out of the way. The more traditional view is that once a service moves inside the corporate wall, it becomes somewhat insulated from everyday price and service level competition, the reasoning being that opening these services up to competition would be too chaotic and would ignore the sunk cost associated with starting up, or winding down, one of these functions.

A more hybrid solution that I like is to treat the first few years of a shared service function like a “business partnership” with defined parameters and conditions that must be met for the contract to continue. It takes a little bit of the edge, or outsourcing “threat”, off the table, and allows the operating unit and shared service function to collectively work on solving the problems at hand.

Still, shared services functions must look toward an “end state” where they begin to appear more and more like their competitors in the external marketplace and less like corporate entitlements. In the end, they must view their services as “universally contestable” with operating management as their #1 customer. For many organizations, particularly the larger ones, that’s a big change in culture.

Pricing– Save for the conservationists and “demand-siders”, most modern-day economists will tell you that the “price tag” is the way to control consumption of almost anything, from drugs to air travel. And it’s no different in the game of managing corporate overheads.

Once you’ve got the accountabilities squared away, and you’ve determined the “cost drivers” controllable by operating management, the price tag is the next big factor to focus on. One of the most important pieces of the service contract you have with operations management is the monthly invoice, assuming it’s real and complete. It needs to reflect the service provider’s true cost, not just the direct, or variable, costs of serving operations. Otherwise, it’s a meaningless number. In the end, the pricing mechanism needs to be something that can be compared and benchmarked against leading suppliers of a particular service. For that to be possible, price needs to reflect the true cost of doing business.
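A toy comparison, with invented numbers, shows why a variable-only price cannot be benchmarked while a full-cost price can:

```python
# Toy comparison (invented numbers) of a variable-cost-only internal price
# versus a fully loaded price that can be benchmarked against outside vendors.

direct_cost_per_unit = 40.0     # labor and materials to deliver one service unit
provider_fixed_costs = 600_000  # the provider's own facilities, systems, management
expected_volume = 30_000        # service units delivered per year

variable_only_price = direct_cost_per_unit
full_cost_price = direct_cost_per_unit + provider_fixed_costs / expected_volume

print(f"Variable-only price: ${variable_only_price:.2f}/unit")
print(f"Full-cost price:     ${full_cost_price:.2f}/unit")
# Only the full-cost price is meaningful next to an external vendor's quote,
# since a vendor must recover all of its costs to stay in business.
```

An outside supplier quoting $55/unit looks expensive against the $40 variable-only figure and cheap against the $60 full-cost figure; only the second comparison is honest.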

Value Contribution– So far, we’ve only focused on the cost side of the equation. Now, let’s look at service levels.

For the more arcane areas of corporate overheads, where a pricing-for-service approach is more difficult, it is usually worth the time to understand the area’s value contribution to your business unit. The task at hand is finding the one or two key value contributors. For example, in US-based companies, the Tax Department is generally staffed with high-end professionals, and is often the keeper of a substantial tax attorney budget. When viewed from a pure cost perspective, a common rumbling among operating management becomes: why am I paying so much for my tax return?

A better question would be: what value am I getting for my money? In this case, taking advantage of key US Tax code provisions can be expensive, but the cash flow impact (in terms of lower effective tax rates) can be a significant benefit to the operating unit. Clearly delineating and quantifying the value, combined with presenting an accurate picture of the cost to achieve that value (OH charges from the Tax department) can bring a whole new level of awareness to these types of overheads.

Of course, for this to work, you need to ensure that parity exists between the function benefiting from the value generated and the function bearing the costs. So before you allocate costs, make sure you match the budget responsibility with the function that ultimately reaps the benefits you define.
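The value-versus-cost view of the Tax department can be illustrated with hypothetical figures (the rates and amounts below are invented for the example):

```python
# Hypothetical value-versus-cost view of a Tax department overhead charge.
# All rates and amounts are invented for illustration.

tax_dept_charge = 2_000_000    # annual OH allocation to the operating unit ($)
pretax_income = 80_000_000     # operating unit's pre-tax income ($)
statutory_rate = 0.35          # rate before any tax planning
effective_rate = 0.31          # rate achieved through the department's work

tax_saved = pretax_income * (statutory_rate - effective_rate)
net_value = tax_saved - tax_dept_charge

print(f"Tax saved by the department: ${tax_saved:,.0f}")
print(f"Net value after OH charge:   ${net_value:,.0f}")
```

Seen this way, the “expensive tax return” is actually a positive-return service, which is exactly the kind of awareness the value-contribution exercise is meant to create.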

Service Level Agreements– This is the contract that manages the relationship between you and your internal service provider. It contains everything from pricing, to service level standards, to when and how outsourcing solutions can and would be employed. There must be a process in place to negotiate the standards, bind the parties, and review progress at regular intervals. While this can be a rather time consuming process (especially the first time out of the gate), it is essential in setting the stage for more commercial relationships between the parties.

Leadership– As with any significant initiative, competent and visible leadership is key. A good executive sponsor is essential in getting through the inter-functional friction and natural cultural challenges that will likely emerge during the process. Leadership must treat controlling overheads as a significant priority, make the enormity of the problem visible to both sides, and effectively set the “rules of engagement” for how best to address the challenges at hand. Without good leadership, the road toward efficient, value-adding overheads becomes much more difficult to navigate.

——————–
So there you have it…my take on the top ingredients for managing corporate overheads and shared service functions. The road is not an easy one, but if you build in the right mechanisms from the start, you will avoid some of the common pitfalls that your organization is bound to face in its pursuit of a more efficient overhead structure.

b
