Wednesday, 16 December 2009

BI and EPM come together in the 7.5 release of SAP BusinessObjects Planning and Consolidation

Hot on the heels of the announcement that Microsoft supports SAP BusinessObjects Planning and Consolidation, version for the Microsoft platform as a preferred solution for its customers, comes the latest edition of SAP BusinessObjects Planning and Consolidation. As a reminder, the application is available in two “flavors” – a version for the Microsoft platform and a version for SAP NetWeaver. The 7.5 releases of both versions have officially entered Ramp-Up (restricted shipment phase) and are available for customers that have applied and been accepted into the Ramp-Up program.

While there are a number of new features and enhancements in both the 7.5 version for the Microsoft platform and the 7.5 version for SAP NetWeaver, I’d like to call out a specific topic that comes up frequently in conversations with both users and prospective users of the application. Ever since Business Objects was acquired by SAP (January 2008), people have been asking if, when and how the SAP BusinessObjects BI tools can be used to access data within the SAP EPM applications – either to extend the reporting capabilities within the applications, or to reach new types of users that need to interact with the data contained in these applications to support decision making. The wheels had already been set in motion for this direct integration (by which I mean application-layer access, not simply extracting data out of the application into another data repository for reporting purposes) between the SAP BusinessObjects BI tools and some of the applications in the EPM portfolio, but it was not yet available for SAP BusinessObjects Planning and Consolidation (with the exception of a workaround to embed Xcelsius dashboards in the application).

The situation changes with the 7.5 release. Integration between the SAP BusinessObjects BI tools and SAP BusinessObjects Planning and Consolidation is delivered in the 7.5 release of both versions (Microsoft and NetWeaver). Customers that have licenses for the SAP BusinessObjects BI tools can now use Xcelsius, Web Intelligence, Explorer and other tools to access information in SAP BusinessObjects Planning and Consolidation. For example, Xcelsius dashboards that display plan versus actual variances on key performance indicators, completed steps in a business process (yes, Business Process Flows are now available in the 7.5 release of SAP BusinessObjects Planning and Consolidation, version for SAP NetWeaver) or whatever information needs to be consumed in a “dashboard” layout can be created and embedded directly within the SAP BusinessObjects Planning and Consolidation application. This allows managers and executives to see and interact with high-level views of key information pertaining to planning, budgeting, forecasting and financial consolidation processes. Budget contributors also benefit from the ability to input data in a simple, aesthetically pleasing, web-based schedule (via Xcelsius).

Integration between the BI tools and SAP BusinessObjects Planning and Consolidation doesn’t begin and end with Xcelsius. For example, using Web Intelligence, a business user can create a plan versus actual report on the fly (over the web) that pulls in plan data from SAP BusinessObjects Planning and Consolidation and actual data from a data warehouse and they could have this report saved and managed within their SAP BusinessObjects Enterprise environment (like all their other BI reports).
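Conceptually, the kind of plan-versus-actual report described above boils down to joining two data sets on shared dimensions and computing variances. Here is a minimal, purely illustrative sketch in Python (all figures and names are invented; the products themselves do this through their own semantic layers, not through code like this):

```python
# Toy sketch of a plan-vs-actual variance report: plan figures from a
# planning application and actuals from a data warehouse are joined on
# shared dimensions (account, period), then variance is computed.
# Every number below is invented for illustration.

plan = {("Salaries", "2009-Q4"): 950000.0, ("Travel", "2009-Q4"): 120000.0}
actual = {("Salaries", "2009-Q4"): 948200.0, ("Travel", "2009-Q4"): 134500.0}

def variance_report(plan, actual):
    """Return rows of (account, period, plan, actual, variance, variance %)."""
    rows = []
    for key in sorted(plan):
        p = plan[key]
        a = actual.get(key, 0.0)  # missing actuals default to zero
        rows.append((*key, p, a, a - p, 100.0 * (a - p) / p))
    return rows

for account, period, p, a, var, pct in variance_report(plan, actual):
    print(f"{account:10s} {period}  plan={p:>10,.0f}  actual={a:>10,.0f}  "
          f"var={var:>+10,.0f} ({pct:+.1f}%)")
```

The real value of the integration is that this join happens live against the applications, rather than against extracts copied into yet another repository.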

The question that typically follows “Can I use the SAP BusinessObjects BI tools with SAP BusinessObjects Planning and Consolidation?” is “When should I use the SAP BusinessObjects BI tools versus the reporting capabilities that come within SAP BusinessObjects Planning and Consolidation itself?” One reason to consider using the BI tools is for accessing and analyzing information from multiple sources of data, including SAP BusinessObjects Planning and Consolidation – for example, as outlined above, where a business user views a report or dashboard with plan data from SAP BusinessObjects Planning and Consolidation and actual data from an EDW. Secondly, organizations that already have the BI tools (there is a very large installed base of BusinessObjects users) may want to further leverage these investments, or BusinessObjects is already their standard for information access and they want to create, store and manage reports containing SAP BusinessObjects Planning and Consolidation data in SAP BusinessObjects Enterprise. A third reason to leverage the BI tools with SAP BusinessObjects Planning and Consolidation is to extend the reach of SAP BusinessObjects Planning and Consolidation information beyond the realm of Finance. While Finance loves Excel, and Excel may be the most ubiquitous “reporting” tool on the planet, outside Finance there isn’t such an affinity. Not everyone is comfortable working in Excel, and “casual” business users may prefer a cleaner, less complex interface to view and interact with information.
SAP BusinessObjects Explorer has been designed for just this purpose.



Monday, 14 December 2009

Measuring success - picking what's important

When it comes to strategic planning and execution it is the execution part that is the hardest and therefore should deserve the most attention. This is obvious as it is much harder to do something than merely say you’re going to do it. What is even harder is staying disciplined and making an activity a regular practice. Therefore, when selecting software applications to support and enable strategic planning and performance management activities it is important to consider how the solutions in question address the execution end of things and not just planning and reporting.

Solutions that fit this bill will include functionality for defining objectives (or goals), key performance indicators (KPIs; we’ll use this term over “metrics” as we want to include the notion of a target as well as an actual value) and initiatives (high-level activities). Key performance indicators will be covered in detail in another blog.

The inclusion of initiatives is the aspect most likely to be overlooked. Without initiatives linked to objectives it is all too easy to lose track of what action is being taken to achieve an objective, and of the resources, both money and people, being assigned to the task. Furthermore, when teams are drawn from across an organization, the individuals assigned to an initiative may not report to the objective owner. Assigning owners to the initiatives provides visibility and accountability. Without this it is harder for an objective owner to manage the work being performed on the initiatives in place.

Deciding what you are going to do (objectives) and how (initiatives) is fairly easy compared to defining suitable KPIs. However, it is the KPIs that are key to success (no pun intended). Selecting one or more metrics that truly measure performance on an objective determines whether you are really successful or just think you are. Let’s take an example from the game of soccer. In order to play soccer you need to be fit – pretty obvious. But what you really need is to be able to alternate between walking, jogging and sprinting at high speed for short durations over a 90-minute period. So your objective is to be fit enough to last 90 minutes and contribute throughout the game, especially the parts that involve sprinting and chasing after the ball. You (or your coach) might decide that in order to meet this objective your training schedule (initiatives) should include lots of running. One way to do this might be to incorporate running fixed distances, with the goal of reducing the time it takes, into your training schedule – for example, reducing the time taken to run four miles from 36 minutes to 28 minutes. The problem is that even if you meet your “goal” of running faster, does it help you play soccer better and ultimately help your team win? It probably helps somewhat, but you would be better served including drills that improve your sprinting ability, along with the ability to keep playing at a competitive level for 45 minutes at a time.

We can (and should, when thinking about KPIs) take this to another level. The fitness objective and associated KPIs are obviously important in a sport such as soccer. This would fall into the category of an output metric. Output metrics measure activity or results, but not specifically the desired or eventual outcome of such activity. Examples from the business world would include the number of widgets shipped or, for the public sector, the number of people vaccinated; from our soccer example, the number of goals scored or allowed. So fitness is a step towards winning soccer games, but being fit, while it likely helps, does not ensure or measure the ultimate goal of winning games or a championship. It may be (although it is not likely) that a much less fit team ends up winning.

In our training schedule for our soccer team we will likely have a number of input metrics (e.g. how many times we practice, how many times we run certain drills) which measure the effort and activity we are putting into our training, as well as some output measures. When it comes to the actual matches our team plays we’ll also track some output measures, e.g. goals for and against. But what we really care about are the outcome measures. To this end, winning is more important than scoring lots of goals. In this sense it is better to be the team that wins the most, albeit with low-scoring games, than one that scores lots of goals but doesn’t end up winning as many games (even though scoring goals is fun).

Understanding the distinction between input, output and outcome KPIs is an important step towards picking the right measures for your objectives. Deciding on and creating good KPIs is one of the hardest parts of strategy definition and something many organizations struggle with.
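To make the input/output/outcome distinction concrete, here is a small hypothetical sketch in Python of KPIs that carry both a target and an actual value, tagged by kind. All of the figures are invented, and nothing here reflects any particular product; it simply illustrates the structure a KPI needs.

```python
from dataclasses import dataclass

# A KPI pairs a target with an actual value; the "kind" field records
# whether it measures effort going in (input), activity produced (output),
# or the result we ultimately care about (outcome). Figures are invented.

@dataclass
class KPI:
    name: str
    kind: str        # "input", "output" or "outcome"
    target: float
    actual: float

    def attainment(self) -> float:
        """Actual as a fraction of target (1.0 = on target)."""
        return self.actual / self.target

kpis = [
    KPI("Training sessions per week", "input",   target=4,    actual=5),
    KPI("Goals scored per game",      "output",  target=2.0,  actual=2.4),
    KPI("Games won this season",      "outcome", target=20,   actual=14),
]

for k in kpis:
    status = "on track" if k.attainment() >= 1.0 else "behind"
    print(f"{k.kind:>7}: {k.name:28s} {k.attainment():.0%} of target ({status})")
```

Note how the input and output KPIs here exceed their targets while the outcome KPI lags behind – exactly the trap the soccer example warns about.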

SAP BusinessObjects Strategy Management provides organizations with many capabilities to enable their strategic planning process and manage the execution of strategy.

Friday, 27 November 2009

Delivering scenario analysis in closed-loop EPM

There is broad agreement that the core methodologies that make up Enterprise Performance Management are budgeting, cost and profitability reporting, strategy management and financial reporting. There is also a lot of discussion of the importance of providing business managers with scenario modeling functionality so they can rapidly assess the impact that changes in internal or external factors have on future financial performance.

In fact, as long ago as 2004, the analyst firm Aberdeen found that providing such functionality was a key discriminator in the success of an EPM implementation, with Best in Class companies delivering 5.4 percentage points more than their industry norm in return on assets and 4.8 percentage points more in gross margin. So if scenario modeling is so important, how exactly is it delivered?


Douglas T Hicks* and other writers have consistently pointed out that the application of the cause-and-effect principles that underpin activity-based costing continues to be the only method of creating a valid economic model of an organization. A well-designed model is based upon empirical relationships between key business volumes, (sales volumes, number of customers, invoices and returns), consumption and productivity data (held as resource and activity drivers and cycle times) and expenses, revenues and ultimately profits. But a cost and profitability model is not simply a turgid repository and classification of enterprise data. As Hicks indicates, it is a dynamic economic model of the enterprise inside which changes to any piece of data directly impacts other pieces of data and consequently financial outputs. In fact without an activity-based model, most organizations simply have a void.

 
Some organizations with cost and profitability models already extend their use beyond reporting on the historical cost and profitability of products and customers by using their “what-if?” functionality to test assumptions about the future. By modeling anticipated sales volumes and varying inputs such as the average salary cost in each department and the underlying business drivers to reflect changes to processes and productivity improvements, they can quickly generate a revised resource and capacity plan and a new high-level profit and loss account. The output of this type of activity-based or driver-based modeling does not typically go down to the detailed level of chart of accounts that is produced as part of the annual planning and budgeting process, and excludes new business and other strategic initiatives that require zero-based or bottom-up budgeting. However it does provide significant benefits for generating the operating expenses for ongoing businesses that will expedite the more detailed budgeting process. I’ve recently been working with three organizations that have adopted such an approach – two from the public sector and a bank – and the benefits they are reporting are:

  • More accurate resource plans and budgets – a definite benefit for public sector organizations, as it means they don’t have to go back to central government, cap in hand, for more funding during the year.
  • A shorter budgeting cycle, in that business managers don’t have to build up their budget submissions themselves and can simply review many of their key line items – particularly those for staff and other variable expenses.
  • Greater agility, in that they can finally move to managing the business with rolling quarterly re-forecasts. Personally, I have struggled to understand how rolling re-forecasts can ever be delivered unless some kind of modeling is involved.
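To illustrate the driver-based mechanics described above, here is a deliberately simplified sketch in Python (every rate, driver and volume is invented) showing how a change in a single business volume ripples through activity drivers to headcount and a high-level profit figure:

```python
# Toy driver-based model: a sales volume drives activity volumes via
# consumption rates, activity volumes drive required headcount via a
# productivity assumption, and headcount drives expense. A real
# activity-based model has many such chains; all numbers are invented.

def driver_model(units_sold,
                 invoices_per_unit=0.25,      # activity driver
                 invoices_per_fte_year=8000,  # productivity assumption
                 avg_salary=55000.0,          # resource cost per FTE
                 price=40.0,                  # selling price per unit
                 material_cost=22.0):         # variable cost per unit
    invoices = units_sold * invoices_per_unit
    billing_fte = invoices / invoices_per_fte_year
    revenue = units_sold * price
    expenses = units_sold * material_cost + billing_fte * avg_salary
    return {"invoices": invoices,
            "billing_fte": billing_fte,
            "revenue": revenue,
            "profit": revenue - expenses}

base = driver_model(units_sold=1_000_000)
scenario = driver_model(units_sold=1_200_000)   # "what if volume grows 20%?"
print(f"base profit:     {base['profit']:>12,.0f} with {base['billing_fte']:.2f} FTE")
print(f"scenario profit: {scenario['profit']:>12,.0f} with {scenario['billing_fte']:.2f} FTE")
```

The point is not the arithmetic but the cause-and-effect chain: because expenses are derived from drivers rather than typed in, a what-if scenario produces a revised capacity plan and P&L automatically.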
So this is encouraging news for those folks who have been advocates of driver-based approaches to planning and budgeting, and with the advent of finance-friendly tools such as SAP BusinessObjects Financial Information Management for moving data between previously siloed EPM applications, it’s now easier to deliver. Using such an approach also achieves a happy half-way house between those who criticize traditional budgeting as being unable to cope with today’s uncertainty and those who still regard it as the main performance management tool in their organization. Everybody wins.

* "I May Be Wrong, But I Doubt It: How Accounting Information Undermines Profitability", Douglas T Hicks, Lulu Publishing, 2008, ISBN 0557031591


 

Wednesday, 18 November 2009

SAP and Microsoft Join Forces in Enterprise Performance Management

You may recall that in January 2009, Microsoft announced it would be discontinuing PerformancePoint Server, its standalone BI/EPM offering. Many of the elements of PerformancePoint will be bundled with Microsoft SharePoint Server; however, the planning module/capabilities within PerformancePoint will no longer be developed. This announcement surprised many, and industry analysts like Forrester’s Paul Hamerman, who wrote about the news, suggested that companies evaluating Microsoft’s planning, budgeting and forecasting capabilities should now look elsewhere. But where exactly should they look? While there are a number of vendors that provide planning capabilities based on Microsoft technology, they are smaller, niche vendors. What’s been missing is a Microsoft strategic partner with global reach, delivering a suitable offering to support many of Microsoft’s enterprise customers.

Until now, that is, as SAP today announced that it has joined forces with Microsoft on the EPM front; more specifically, Microsoft supports SAP BusinessObjects Planning and Consolidation, version for the Microsoft platform (formerly known as OutlookSoft and Business Planning and Consolidation, and often referred to simply as “BPC”) as a preferred planning, budgeting, and forecasting application for its customers. As part of a strategic partnership, the two organizations plan to engage in joint go-to-market initiatives and collaboration on the product front – in fact, some of the collaboration has already begun. This is a very exciting announcement at a time when, according to research recently conducted by AMR Research and SAP, many organizations see investments in planning, budgeting, and forecasting solutions as their most strategic priorities in 2010. You can read the announcement here: http://www.sap.com/solutions/sapbusinessobjects/large/enterprise-performance-management/newsevents/index.epx

So what was the genesis of this partnership? This announcement is the culmination of discussions that began soon after Microsoft’s announcement. As already mentioned, the move surprised many and left Microsoft customers wondering what they should do when it came to selecting a planning application – or, in the case of services partners/value-added resellers, an application to align (or re-align) their practice around. The only leading (as recognized by industry analysts such as Gartner and Forrester) unified planning and consolidation application built on a Microsoft platform is SAP BusinessObjects Planning and Consolidation, version for the Microsoft platform (built on Microsoft Office, i.e. Excel, .NET and SQL Server). Over 1,400 customers are already using the application (the installed base has more than doubled since SAP acquired OutlookSoft in 2007) or the recently launched SAP BusinessObjects Edge Planning and Consolidation, designed for small to midsized enterprises and also built on the Microsoft platform.

This announcement should also help answer a question that has been lingering in many people’s minds ever since OutlookSoft was acquired by SAP and the application was ported to the SAP NetWeaver platform (in case you weren’t aware, there are now two versions of the product – the successor to the OutlookSoft product built on the Microsoft platform and a new version built on SAP’s NetWeaver technology platform): is the Microsoft version of SAP BusinessObjects Planning and Consolidation going away and/or no longer being invested in? The answer is no. SAP’s roadmaps have clearly stated this; much of the confusion has been fueled by competitors trying to spread fear, uncertainty and doubt in the minds of existing customers and prospects. So let’s set the record straight: the reality is that SAP is continuing to optimize its EPM solutions for SAP and non-SAP environments, and SAP BusinessObjects Planning and Consolidation, version for the Microsoft platform is a key part of the roadmap moving forward. The SAP and Microsoft partnership around SAP BusinessObjects Planning and Consolidation is testament to this.

Industry analyst and media comments on the announcement have been positive (http://blogs.wsj.com/digits/2009/11/17/microsoft-and-sap-again-team-up-against-oracle/ , http://seekingalpha.com/article/173987-microsoft-and-sap-vs-oracle-a-very-big-deal, http://www.amrresearch.com/content/view.aspx?compURI=tcm:7-49246&title=SAP+Partners+With+Microsoft+for+Planning+and+Consolidations) but what does this partnership mean to you if you are an organization looking to invest in a planning solution? To begin with you can rest assured that your investment is safe – as mentioned above, SAP is continuing to invest in the SAP BusinessObjects Planning and Consolidation, version for the Microsoft platform. Secondly, SAP and Microsoft product teams plan to collaborate focusing on continuing to optimize the application for the Microsoft platform and leverage key Microsoft technology components. Thirdly, you have an option to consider that is backed by two of the leading global software organizations.

Tuesday, 3 November 2009

‘You don’t need a weatherman to know which way the wind blows’

Most national meteorological services have the most powerful computing available in their country. The system installed in the UK Meteorological Office takes up the area of two football pitches, is capable of 125 trillion calculations per second and is reported to take two months to boot up. This has enabled it to incorporate more data into its predictive models and to forecast down to 5km by 5km zones – four times more detailed than the current granularity of 10km by 10km zones. Such advances have definitely brought benefits, and today the 72-hour forecast is more accurate than the 24-hour forecast was a quarter of a century ago – see the inset photograph to be reminded of what a weather forecast looked like at the time!

The amount of information the Met Office can now produce has undoubted use in science and commerce, but the British public remain unconvinced. In fact they are starting to vocalize their frustration at the amount of detail being communicated in radio and television forecasts which is causing them to mentally switch off and not absorb any information at all.

In response to listeners’ suggestions, BBC Radio 4 has been testing a new way of reading the weather forecast, clearly flagging the region being discussed before giving a concise summary of how the weather will change in the next 24 hours, much like a shipping forecast. That’s all most of us really want to know to help us decide what to wear and whether we should plan a round of golf or go to see a movie. Those involved in activities such as sailing and mountaineering who need more detail can get it from more specialist forecasts available online.

There is an obvious lesson here for all of us involved in performance management. Reporting tends to come as an afterthought, and all too often we inundate business users with data that they don’t have the time to digest and most of the time don’t actually need. Worse still, we don’t invest the time in observing how business users consume information for decision support and how they prefer to have it presented (in my experience that typically means as ratios and trends, so that exceptions and changes can be quickly detected).

Self-service business intelligence tools such as SAP BusinessObjects Explorer now mean that users can gain immediate insight into vast amounts of data on demand and visualize it in a way that makes sense to them personally, improving their ability to make timely and informed decisions. So if you have budgeting data generated in SAP BusinessObjects Planning and Consolidation, or cost and profitability data generated in SAP BusinessObjects Profitability and Cost Management, then using tools like SAP BusinessObjects Explorer makes a lot of sense, especially when you don’t quite know exactly what information you want to analyse. For me this is better than having to rely on someone else to prepare reports or dashboards for my consumption, as all too often they get carried away with the technology rather than focusing on the needs of the business user. Perhaps we should invest in some training in information design and simplification techniques for them – something that would have undoubtedly benefited those responsible for our broadcast weather forecasts.

Friday, 23 October 2009

IFRS adoption - lessons from the European experience

I’ve been monitoring the adoption of International Financial Reporting Standards (IFRS) for several years now, starting with Europe’s transition to IFRS in 2005. However, nowhere is IFRS hotter right now than in the United States. Last year Christopher Cox, the outgoing chairman of the US SEC, published a roadmap for IFRS adoption starting in 2014. Now, after some apparent silence from the SEC, Mary Schapiro has recently indicated that we’ll likely get a final decision on adoption in the near future.

The question I’ve been hearing a lot recently from customers (it’s generally assumed that IFRS will be required in the US at some point, so it’s just a matter of when) is: what do folks need to do to get ready? This is where things get a little tricky. Every organization is different, from structural, organizational, and technology perspectives. There’s no “silver bullet” that can turn on IFRS at the flick of a switch – the scope of IFRS is simply too broad.

Interestingly, some recent AMR Research data indicates that organizations in the U.S. are not well prepared for IFRS adoption. Only 9% are implementing now, with the vast majority of respondents (63%) yet to define an IFRS strategy or waiting until IFRS deadlines are well articulated.

From the technology perspective, I think we can learn a lot from what already happened in Europe – both good and perhaps less good practice. The timeframe for adoption was very short (far shorter than we might expect in the US) so the vast majority of organizations looked for a quick fix. Typically they decided to leave the transactional and ERP systems well alone and only perform adjustments in the consolidation and reporting layer to provide IFRS specifics as and where needed. This is the approach often referred to as “top-side adjustments”. This allowed companies to meet the initial deadline in the easiest way possible but increased the complexity and cost of reporting in the medium and long term.

In the USA, a lot of folks seem to be focusing their efforts on getting the ERP and general ledger systems running in parallel to support the expected dual US GAAP/IFRS reporting phase. However, I think this approach is incomplete. In order to transition to IFRS, all aspects of the financial reporting supply chain need to be considered – from sub-ledgers and general ledgers all the way through to the group consolidation and reporting systems. The more US GAAP and IFRS data you have at the transactional layer, the easier it actually becomes to transition the consolidation and reporting layer, and with this approach there should, in theory, be no need for top-side adjustments. However, this represents an idyllic world which for most organizations is still a distant reality. The group reporting layer doesn’t simply exist to enable top-side adjustments – it also provides a comprehensive group reporting mechanism that addresses the IFRS specifics of consolidation methods, reconciliation reports, commentary and report layout, right through to digital disclosure in the form of XBRL. All of this remains an essential part of the process, irrespective of whether all required US GAAP/IFRS information is readily available in the underlying transactional systems.

So my point here is – organizations should study and learn from the European experience, but resist the temptation to perceive the consolidation and reporting layer purely as the ugly stepsister that at best enables a short term solution via top-side adjustments. Instead, think about how to get the best from all worlds and how each area can make a relevant and lasting contribution to your IFRS strategy. Take a long term view and start focused – perhaps by considering a gradual, phased approach across the entire reporting supply chain.

One thing is clear though – starting the strategy and planning process now will leave time to take a balanced view and avoid the asphyxiating pressure associated with an imminent deadline that feeds short-sighted “tick in the box” solutions. It was the short deadline that resulted in the top-side approach in Europe – something that’s not expected in the U.S.

I recently published a White Paper that discusses some of the technology challenges associated with moving to IFRS, so if this is on your radar and you’d like to discover more, please do check it out.

Monday, 19 October 2009

Small steps to big improvements

So just what is it that makes a successful user of performance management software? Financial performance, market share, efficiency improvements, share value, cost savings…are all viable measures of success. But as we know, a business comprises many parts and key performance measures are often combinations of small successes that build upon each other to deliver a corporate goal. Sometimes it’s also nice to hear about these small successes.

Last week I met with a customer that I haven’t seen for a while to talk about their business and how they were using their performance management system. As we spoke it became clear that they were very happy with the software system and were doing some interesting things that enable them to improve the way they run their business.

My attention was piqued when the conversation turned to data integration, a subject close to my heart as it is presently the focus of much of my work. I was interested to learn that they were not only integrating data from source systems into their performance management application, but also integrating individual performance management applications – in this instance planning and costing. This was great to hear, because it showed that they had innovated in their use of the system to draw business advantage from integrated systems, and while they were only exchanging simple data, this was already providing tangible business benefits, primarily in terms of timeliness and accuracy, that would help them make quicker and better decisions. In addition, they had saved some time, and ultimately cost, from what had previously been a manual process. Now of course this is just one small success, in one department of this large organization, but knowing this customer it’s not the only small success. It’s one of many.

As the software industry starts to create business systems that support closed loop performance management this kind of success story will become commonplace and SAP BusinessObjects Financial Information Management is going to be an important part of an EPM deployment for many SAP users. Big or small, we should reflect on our improvements and successes along the way, because it’s in these small steps that we will build a foundation for continuing success.

Tuesday, 13 October 2009

Déjà vu? Almost certainly

I’ve written on this topic many times in recent years and no matter how much progress I think is being made I still find myself with something new to say on the subject, as well as lots of old messages that I feel compelled to keep repeating. No matter what remedies we seem to devise the challenge remains and yet I’m not talking about some kind of embarrassing illness. The fact is the financial close is one of those topics that just won’t go away.


Why is that? Well, for one, the goal posts keep getting moved. The demands of the market put constant pressure on delivering ever more timely and better quality reporting, and regulators are working overtime to introduce new requirements designed to protect stakeholders of all persuasions. And that’s even before we consider the sheer complexity that today’s financial controller has to cope with when it comes to producing a set of audited financial statements. So what is the solution? Is there one?


Well there is certainly no silver bullet, no magic blue or red pill or even software solution that can solve all the issues that together form the barriers to a fast and efficient close process (despite what many would have us believe). That said I think we now have enough evidence and enough time has passed for us to come to realize that actually the solution lies in knowing that the challenge can never be solved. This enemy will never truly be beaten…or at least not while I’m still able to write about it.


As I said, the fact is the financial close is one of those topics that just won’t go away, and it got me thinking about the last whitepaper I wrote on the subject, where our closing argument was about continuous improvement. It’s become clear to me that this is actually the key. While all our thinking about how to approach a fast close project is as sound as ever, and it’s certainly something an organization has to do in order to close its books and report in line with its peers, it’s the steps an organization applies afterwards that will be the key to its continued success and improvement, even in the face of an ever-changing regulatory landscape.


Any company that has been through a fast close project following the methodology we prescribe will have defined a target. It will have looked at its peers in the market and, for example, set itself the goal of a five-day close. It will then have examined its close processes, identified the barriers to a fast close in the organization, and defined an action plan comprising quick wins and big wins to reach that goal. The key to maintaining this target, and perhaps improving further, then lies in both continuous and opportunistic improvement, achieved by repeating the same approach every time something changes.


Let’s take a case in point: the transition to IFRS in North America or Japan, or perhaps the adoption of XBRL on a global basis. Now I’m not going to get into the details of how we should tackle these issues here (I’ll leave that to my fellow blogger Philip Mugglestone), but what’s interesting is that both of these changes will affect reporting processes and, as a result, may affect the speed and efficiency of the close process. So while the patient is on the table, take the time to consider that potential impact. Will it give rise to a new fast close barrier that has not previously been encountered? What steps need to be taken not just to mitigate the risk, but also to determine whether this wider change creates an opportunity to address something else in the close process that was not possible in the original project?


Of course, every time I’ve shared this opinion recently it has raised the next challenge, which is one of cost. Too much regulation, and a cost of compliance that is too high. The solution, many think, is to get compliant as quickly and as cheaply as possible. Not so, in my opinion. If we are to learn anything from both SOX and the transition to IFRS in Europe, it is that an approach which just “ticks the box” will in fact cost more in the long term.


So what’s my point? Essentially, I’m saying that our work is never done, and to prove it we only have to look at the Global Close Cycle Rankings from BPM International. This annual research program has four years of data on the fourth-quarter close processes of over 1,000 global companies. While there seems to be a general trend of improvement, every year the results show a mix of improvement in some countries and worsening performance in others, as companies that had previously improved their close times see them increase again. This underlines the need to develop an approach that strives for continuous and opportunistic improvement, so that a company can not only maintain its close times in the face of an ever more complex environment, but also reap the benefits of a more sustainable close process that will, in turn, be more adaptable in a changing landscape.

Friday, 9 October 2009

Why peanut butter can be bad for you!


Are your customers profitable? Whenever I ask this question of business executives, the immediate response is always “Yes”. The comment that typically follows is: “How can a customer not be profitable? They are purchasing our products and services. They are contributing to our bottom line.”

Simply put, “Customer Profitability” (CP) is the difference between the revenues earned from, and the costs associated with, a customer relationship in a specified period. According to the management guru Philip Kotler, "a profitable customer is a person, household or a company that, over time, yields a revenue stream that exceeds by an acceptable amount the company's cost stream of attracting, selling and servicing the customer".


There is natural variability in profitability across customers, and it is often hard to distinguish the “Profit Creators” from the “Profit Destroyers”. In fact, most organizations are likely to have some customers who do not add to the bottom line, but whom they are happy to deal with because they add critical mass to the business or because they are a source of knowledge about the needs of a particular customer segment.


The real challenge in measuring customer profitability is identifying the true costs associated with servicing customers, and then accurately assigning those costs to individual customers. The ERP and performance management systems typically deployed in companies do an excellent job of identifying the revenue and direct costs associated with each customer. However, they fall short of answering the fundamental question: “What does it cost to serve this customer?” Most companies tend to apportion overhead costs almost like spreading peanut butter, basing the allocation on simple metrics such as the number of transactions or a percentage of total cost of sales.


Activity-Based Costing (ABC) can be used to answer these tough costing questions. It helps identify the right amount of indirect expense to assign to a customer based on the amount of each activity the customer consumes. Let us take a look at a simple example in which we analyze customer profitability using both the apportionment methodology and Activity-Based Costing. We have two customers: A and B. If you review the customer Profit & Loss under the two methodologies, the results are very different.



When using the apportionment method, overhead costs are allocated as a percentage of cost of sales. On that basis, Customer B looks like the profitable customer. When we look at the same Profit & Loss statement under Activity-Based Costing, the picture is totally different. Here we have broken down all the costs associated with servicing the customer, e.g. sales calls, shipping and packaging. In reality, Customer B has a very high “cost to serve” and is in fact making a loss. This is a fairly typical outcome, with apportionment tending to overcost simple customers and undercost those with complex requirements and demands.
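The contrast between the two methods can be sketched in a few lines of code. All the figures below are hypothetical (the original worked example is not reproduced here): two customers with identical revenue, where overhead is first spread by share of cost of sales and then charged out at assumed per-activity rates.

```python
# Hypothetical figures for two customers; activity rates are assumptions
# chosen to illustrate the apportionment-vs-ABC contrast, not real data.
customers = {
    "A": {"revenue": 100_000, "cost_of_sales": 60_000,
          "activities": {"sales_calls": 10, "shipments": 20, "returns": 1}},
    "B": {"revenue": 100_000, "cost_of_sales": 55_000,
          "activities": {"sales_calls": 60, "shipments": 120, "returns": 40}},
}

overhead = 40_000  # total indirect cost to allocate

# Method 1: apportionment -- spread overhead by share of cost of sales
# (the "peanut butter" approach).
total_cos = sum(c["cost_of_sales"] for c in customers.values())
for c in customers.values():
    allocated = overhead * c["cost_of_sales"] / total_cos
    c["profit_apportioned"] = c["revenue"] - c["cost_of_sales"] - allocated

# Method 2: activity-based costing -- charge each activity the customer
# consumes at an assumed cost-per-driver rate.
activity_rates = {"sales_calls": 200, "shipments": 150, "returns": 500}
for c in customers.values():
    cost_to_serve = sum(qty * activity_rates[act]
                        for act, qty in c["activities"].items())
    c["profit_abc"] = c["revenue"] - c["cost_of_sales"] - cost_to_serve

for name, c in customers.items():
    print(f"Customer {name}: apportioned profit {c['profit_apportioned']:,.0f}, "
          f"ABC profit {c['profit_abc']:,.0f}")
```

With these numbers, Customer B looks more profitable under apportionment, yet under ABC its heavy consumption of sales calls, shipments and returns pushes it into a loss while Customer A turns out to be the real profit creator.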


So without ABC, many organizations are generating totally erroneous reports that can lead managers to take decisions that destroy rather than create value. Not exactly smart, particularly in a downturn. The methodology has been around for 25 years now, and it's something organizations can no longer afford to sidestep if they want an accurate measurement system for reporting customer profitability.

Tuesday, 6 October 2009

From Cash to Carbon


After spending many years using costing methodologies to calculate the profitability of products and customers in numerous industries, it's a welcome change to have the opportunity to work with organizations that are adopting the same activity-based methodologies to calculate the carbon emitted across the life cycle of products and services. 

Although there is no legislation yet, a number of initiatives across the globe are driving food and beverage manufacturers, and some service providers, to start 'eco-labeling', as it's called. This is throwing up some interesting results that will hopefully encourage us to change our behaviour and do our bit to save the planet.


For instance, the Paris Metro now publishes the amount of carbon emitted by each journey on the ticket and compares it with the equivalent journey by car. As the attached image of a ticket shows, a 23-minute journey by Metro results in 15g of carbon, compared with 815g for the same journey by car. So unless you can get fifty or so people into the car, it's better to take the Metro, or perhaps cycle. (PS: I accept no responsibility for any fatalities that may result from cycling in Paris.)
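The "fifty or so people" figure is easy to check from the ticket's own numbers: the car only beats the Metro per person once its emissions are shared across enough occupants.

```python
import math

# Figures printed on the Metro ticket: grams of CO2 for the same journey.
metro_g = 15   # per passenger, by Metro
car_g = 815    # for the car as a whole

# The car matches the Metro per person when car_g / n <= metro_g,
# i.e. n >= car_g / metro_g; round up to whole passengers.
break_even_passengers = math.ceil(car_g / metro_g)
print(break_even_passengers)  # -> 55
```

So a car would need around 55 occupants to match the Metro's per-passenger footprint on this journey, which is where the "fifty or so" comes from.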

Some of the initial calculations done by food manufacturers are also throwing up stunning results, with the production, supply and consumption of a drink or food item often emitting many times its own weight in carbon.

But unlike cost and profitability reporting, where the expenses fed into a model can always be reconciled with the assigned costs, carbon labeling is not yet a precise science. This will undoubtedly result in manufacturers of almost identical products coming up with varying results, depending on how they have defined the boundaries of their life-cycle assessment and on the carbon coefficients used in the calculation.

Some writers have suggested that this may discredit eco-labeling and confuse consumers. But for me the real issue is not whether comparable products from two different manufacturers show broadly similar amounts of carbon, but whether a benchmark figure for the carbon emitted by a far more eco-friendly alternative is also shown, just as on the Metro ticket above. Only then will consumers have the information they need to make truly informed decisions.