Activity-Based Costing (ABC) can provide tremendous insight into the financial performance of your organization. However, it has had a checkered past, due in part to overly complex models and the large amount of manual effort required to maintain them. ABC also fell out of favor during the era of very low interest rates and the “growth at any cost” mindset.
Global economic conditions have changed quite a bit in only a few short years, returning to higher inflation and interest rates reminiscent of, although not as severe as, the conditions of the 70s, 80s and 90s, when Activity-Based Costing was used extensively. Companies now have a renewed focus on efficiency, profit and sustainability, and Activity-Based Costing is the perfect solution to re-introduce into organizations. The trick is figuring out how to start, particularly if you have never implemented an ABC model before. We won’t go into the methodology of ABC models; there are plenty of online courses and books available to cover the basics. What we can provide is guidance on best practices we have discovered over more than two decades of building very large models for very complex organizations across a range of industries including oil/gas, military (Australian and US), government, telecommunications, gaming, insurance, and more recently universities. These models can easily have hundreds of millions of allocation paths, so they are definitely not a spreadsheet exercise, but you can start simply in a spreadsheet and then grow and evolve the model over time.
Activity-Based Costing is a very simple concept that is easy to learn but difficult to master, because it takes time to learn the best ways to do things. They say a smart person learns from their own mistakes, but a wise person learns from others’ mistakes, so below are the top lessons we have learnt, the hard way, over 20+ years of building Activity-Based Costing models.
1. Don’t boil the ocean
This phrase was used by the E-liability Institute when it came to building environmental models for organizations. Essentially, don’t try to do everything all at once. Start simply and grow and evolve the model over time.
Our Higher Education models are a classic example of how we have evolved towards simplicity over time. When we first started building these models, we would take data from a wide range of sources (Finance, HR, Payroll, Facility Management, Student Management, Timetabling) and build wonderfully large models. The problem is that these large models are confusing to end-users, and if users don’t understand the model, they don’t trust the model, and if they don’t trust the model, they don’t use the model. A much better approach is to start with a simpler model with fewer data sources, take the end-users on a journey of learning and discovery, and get their input on how to grow and evolve the model over time.
2. Scoping Study
We always start with a scoping study: a two-week analysis of the data required and the boundary of the model (what’s included and what’s excluded). Remember, if you exclude parts of the organization now, you can easily add them later. The objective is to avoid analysis paralysis, that is, spending months or years developing a detailed specification of the model to be built. It is much better to take a lean approach, start quickly and develop a “minimum viable product” model. Once this is developed you can solicit feedback from the organization, who can now see something working, albeit simplistic to start with.
When the scoping study is complete, management can perform a go/no-go check. This is the decision point: either continue with building the quick, simple model, or pause because you have identified issues inside the organization (usually data related) that need to be resolved before progressing to a model.
3. Lots of Quality Data
Good quality data is the key to a good model. The objective is to get as much detailed data as possible for model inputs (resources) and model outputs (products/services), as well as data that can be used in Driver formulas. The scoping study is when you analyze all this data to check for quality, consistency, and completeness.
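To make these checks repeatable on every data refresh, they can be scripted rather than done by hand. Below is a minimal sketch in Python, assuming the source-system extracts land as CSV files; the file and column names are hypothetical and purely illustrative:

```python
import pandas as pd

# Hypothetical finance and HR extracts; file and column names are illustrative.
gl = pd.read_csv("general_ledger.csv")   # columns: account_code, cost_center, amount
hr = pd.read_csv("hr_positions.csv")     # columns: cost_center, fte

# Completeness: flag rows missing any field the model will allocate on.
required = ["account_code", "cost_center", "amount"]
missing = gl[gl[required].isna().any(axis=1)]
print(f"{len(missing)} GL rows with missing allocation fields")

# Consistency: cost centers in the GL with no match in the HR extract.
unknown = sorted(set(gl["cost_center"].dropna()) - set(hr["cost_center"]))
print(f"{len(unknown)} unmatched cost centers, e.g. {unknown[:5]}")

# Quality: the model input total should tie back to the trial balance.
print(f"GL total to reconcile: {gl['amount'].sum():,.2f}")
```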
One of the major side-effects of building Activity-Based Cost models and reviewing data quality is that data errors are easily highlighted. An example of this arose while building one of our US Navy models, where a code used for the allocation of munitions to a bombing range was mistyped and the munitions were allocated to the childcare center. I should stress that this didn’t mean the munitions were physically shipped to the kiddies, just the financial allocation. So when we built our model and showed the results to the support services group responsible for childcare, they could quickly and easily identify this error, with a few puzzled looks.
4. Avoid Surveys – Develop Work Profiles
People are usually the biggest expense in organizations, and a lot of organizations don’t have detailed timesheets. If timesheet data does not exist, the instinctive fix is to go and survey people to find out what they do. I remember when time and motion studies were implemented; they were horrible, breaking jobs up into minute sub-tasks and figuring out how many minutes or seconds were spent doing each task. It took an extraordinary amount of time to collect this data, and the data changes all the time! This type of data collection works for machinery or robots, where data is captured in real time for every minute task, but it is impossible for people.
We have discovered that the best thing to do in this situation is to build up work profiles based on the type of worker and the jobs they usually do, and run this on a “management by exception” basis. That is, create one profile for all individuals who do a particular job, but let everyone see their specific profile. If they want to change it, they can; if they want to accept the standard, they can. This is a type of survey, but it is nowhere near as manual and time-intensive as surveying everyone. We have discovered over the years that it is much quicker and easier to assume things up front and get people to tell us what’s wrong, rather than start with a blank piece of paper and get them to tell us what they do.
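As a minimal sketch of the “standard profile plus exceptions” idea, the model can hold one default profile per job type and store overrides only for the people who change theirs (the roles, activities and percentages here are made up for illustration):

```python
# Standard work profiles by job type: activity -> share of time (sums to 1.0).
standard_profiles = {
    "lecturer": {"teaching": 0.40, "research": 0.40, "admin": 0.20},
    "administrator": {"student_support": 0.60, "admin": 0.40},
}

# Overrides are stored only for individuals who changed their profile.
overrides = {
    "emp_1042": {"teaching": 0.70, "research": 0.10, "admin": 0.20},
}

def work_profile(employee_id: str, job_type: str) -> dict:
    """Return the individual's override if one exists, else the standard profile."""
    return overrides.get(employee_id, standard_profiles[job_type])

print(work_profile("emp_1042", "lecturer"))  # the exception
print(work_profile("emp_2001", "lecturer"))  # accepts the standard profile
```

This keeps the maintenance burden proportional to the number of exceptions rather than the number of employees.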
5. Minimize Activities
It may sound counter-intuitive that you want to minimize Activities in an Activity-Based Cost model, but we have discovered over the years that a small number of key Activities provides just as good a result as many detailed activities. The major problem with having many activities is that it requires a much larger manual effort to build and maintain the model.
A classic example from our past is the development of the Royal Australian Navy model, a large multi-year project covering the entire Navy. The model was developed with lots of activities, thousands of them, and we had to literally fly around the country and interview people to find out what they did across those thousands of activities. It was a large, manual, painful process for both interviewer and interviewee.
6. Automation
You want to automate model allocations as much as possible. As has been demonstrated in many failed ABC implementations, the greater the manual effort in maintaining and updating the model, the more likely it is that the organization will abandon it. As mentioned in the data section, you want to find those key data sources that can provide metrics for allocating costs using cause-and-effect relationships. These include metrics like square feet/square metres of room or building space (driving expenses like leases, cleaning and maintenance) or volume metrics from production or monitoring systems used to calculate resource consumption and therefore cost. What you want to avoid is manually counting any of these metrics; you want to extract all this data from existing systems, and you want to be able to repeat this regularly with minimal manual intervention.
An example of this is in our Higher Education models, where we have different types of students: International, Domestic, In-State, Out-of-State etc. We can get the type of student and the number of students from the student management system. If there is a specific expense associated with, say, international students, then the old way of modelling would be to find what those international students are studying and allocate that expense to those schools/courses. The automated way is to make one allocation at the top of the model and let the model find the international students and their numbers and make the allocation itself.
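A minimal sketch of that automated allocation, assuming a hypothetical extract from the student management system (the course names and numbers are invented):

```python
import pandas as pd

# Hypothetical student management system extract.
enrolments = pd.DataFrame({
    "course": ["Engineering", "Business", "Arts"],
    "student_type": ["International", "International", "International"],
    "students": [120, 300, 80],
})

international_expense = 250_000.0  # expense tagged to international students

# One rule at the top of the model: weight the expense by international
# student counts, with no manual mapping of the expense to individual courses.
intl = enrolments[enrolments["student_type"] == "International"]
weights = intl.set_index("course")["students"] / intl["students"].sum()
print((weights * international_expense).round(2))
```

When next year’s enrolment extract arrives, the allocation recalculates itself; nobody has to re-map courses by hand.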
7. Minimize Drivers
In the initial stages, you want to focus on a small core of key drivers and, as mentioned previously, automate them as much as possible. Just like activities, organizations can get carried away identifying a wide range of drivers to use, which is even worse if the driver metric has to be manually collected or calculated.
An example of this is the old Oros ABC modelling solution that we used regularly back in the day. One of the shortcomings of the product was that a number of users would calculate driver quantities outside the modelling engine in Microsoft Excel and then import those quantities back into the model. This added a large amount of additional manual data processing, which can also lead to errors in the calculations.
8. Use Different Value Types
Many ABC models only use ‘Expense’, which is typical for a model focused on cost; they may even introduce ‘Revenue’ for margin calculations. Yet there are many more value types that add significant value to models, such as Full Time Equivalents (FTE), square feet/metres, different currencies, and environmental metrics such as CO2 equivalent, each of which can be treated and reported on uniquely. Each of these in turn can become the basis for allocation methods in the model, for example, allocating depreciation using a weighted square-metre driver, or HR services using a headcount or FTE driver (see the sketch after the list below). Finally, all of these values are reportable, that is, the final outputs of the model can be viewed not only from a revenue/expense/margin point of view, but also from a metric consumption point of view: how many FTEs or SQM did product 1 ‘consume’ compared to product 2, and so on.
As an example, our oil/gas decommissioning models have a wide range of different value items, and a small selection is shown below:
- Cost GBP
- Cost NOK
- Cost USD
- Day
- DownDays
- Litres
- Trips
- Volume
- FTE
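As a minimal sketch of the idea, each cost object can carry several value types side by side, with one value type (here, square metres) serving as the driver for another (depreciation expense). The departments and figures below are invented for illustration:

```python
import pandas as pd

# Each department carries several value types, not just expense (invented figures).
departments = pd.DataFrame({
    "department": ["Workshop", "Office", "Warehouse"],
    "expense_usd": [500_000, 200_000, 300_000],
    "sqm": [1_000, 400, 2_600],
    "fte": [25, 40, 10],
}).set_index("department")

building_depreciation_usd = 180_000.0

# Use the SQM value type as the driver for the depreciation allocation...
departments["depreciation_usd"] = (
    building_depreciation_usd * departments["sqm"] / departments["sqm"].sum()
).round(2)

# ...while the non-financial value types remain reportable alongside cost.
print(departments[["expense_usd", "depreciation_usd", "sqm", "fte"]])
```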
9. Materiality
This is key to the initial model build, but also to more detailed models: determining a specific cut-off point for expenses that are just not worth allocating. That might be a percentage (we won’t worry about costs below 10% of total expenditure), a fixed dollar amount, or expenses posted to specific account codes. Many of these expenses can be rolled up into one ‘account’ that can be allocated using a simple driver, rather than having hundreds of smaller accounts that add bulk, yet little value, to your model. We have discovered over the years that tracing small expenses can lead to a significant increase in manual effort with minimal impact on the overall results of the model.
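A minimal sketch of applying such a threshold, rolling everything under the cut-off into a single sundry account (the accounts and amounts are invented, and the 1% threshold is just an example):

```python
import pandas as pd

accounts = pd.DataFrame({
    "account": ["Salaries", "Lease", "Stationery", "Couriers", "Travel"],
    "amount": [900_000, 250_000, 3_000, 1_500, 45_000],
})

# Materiality threshold: anything under 1% of total spend gets rolled up.
threshold = 0.01 * accounts["amount"].sum()
small = accounts["amount"] < threshold

rolled_up = pd.concat(
    [
        accounts[~small],
        pd.DataFrame([{
            "account": "Sundry (rolled up)",
            "amount": accounts.loc[small, "amount"].sum(),
        }]),
    ],
    ignore_index=True,
)
print(rolled_up)  # the sundry line can then be allocated with one simple driver
```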
When building the Australian Navy model, we didn’t have a materiality threshold initially and would have 1% allocations; we even introduced a “partitioning” methodology so we could get even smaller allocations. This all resulted in a VERY slow calculation process. In fact, it never finished: a quick assessment of the amount already calculated compared to what remained suggested it would take approximately 1,500 years to finish the calculation. We quickly resolved to fix this issue and set materiality thresholds.
10. The Models Need Not Be Precise
Professor William F. Massy, in his book “Resource Management for Colleges & Universities”, writes: “The models should be intuitively reasonable, but they need not be precise. The ultimate test is whether their use will lead to better decisions than judgments made without the aid of models.”
And from the bible of Activity-Based Costing, “Cost & Effect – Using integrated cost systems to drive profitability and performance” by Robert S. Kaplan and Robin Cooper: “The goal of a properly constructed ABC system is not the most accurate cost system. Consider a target, where the bull’s-eye represents the actual cost of resources used each time a product is made, a service delivered, and a customer served. To hit the bull’s-eye each time requires an enormously expensive ABC system. But a relatively simple ABC system – perhaps including 30-50 activities and using good estimates…should enable an organization to hit consistently the outer and middle rings…that is, activity and process costs will be accurate to within 5% or 10%. Stage II cost systems, in contrast, virtually never even hit the target, or even the wall on which the target is mounted, because of their highly distorted costs.” The Stage II cost systems they refer to are standard cost systems, where a standard cost is developed to represent cost based on some type of volume metric, e.g. cost per ton-mile, hourly cost of cotton to thread yarn, or cost per pound of output.
Any other tips?
This list is by no means exhaustive. If you would like to recommend your own hints/tips, please feel free to add them in the comments section below.
Timeframe and Level of Effort
Here are some specifics to guide you on your path to developing models. The old way of doing things, where organizations would develop a detailed specification and then roll out the model, could easily take multiple years; I’ve seen instances of five years or more, and it can sometimes take a whole year just to develop the specification.
The following timeframe is based on our experience and is absolutely achievable.
Scoping Study – 2 weeks
One dedicated lead analyst – Project Manager
One dedicated data analyst
Senior Executives (CEO/CFO/COO etc) – 1 hour – Provide guidance on specific company requirements that the model needs to capture. It’s important to secure high level support for this type of model because it will impact all aspects of the business and there will be certain individuals in every organization who will not cooperate.
Source System Owners – 1 – 1.5 hours per system – Discuss data available from each system, limitations of the data. Key systems will include Finance, HR, Payroll, CRM, production systems (company products / services itemized), measurement systems (volume metrics for drivers).
First Iteration of the model – 3-5 weeks
One dedicated lead analyst – Project Manager
One dedicated data analyst
Heads of Departments – 1 – 2 hours – Review initial results of the model, provide feedback on drivers used.
Second Iteration of the model (First Minimum Viable Product) – 3-5 weeks
One dedicated lead analyst – Project Manager
One dedicated data analyst
Heads of Departments – 1 – 2 hours – Review final results of the model, provide feedback.
Senior Executives (CEO/CFO/COO) – 1-2 hours – Review final results of the model, provide feedback.
Subsequent iterations and model updates should only take 3-4 weeks each. The objective is to get feedback and update the model quickly, which means it’s also very important to prioritize suggestions for improvement. Focus on high-priority/urgent suggestions first, then plan other suggestions and model updates over time.
Model Updates
It’s also important to determine the period of time between full updates of the model. To date, the majority of our ABC models have been updated annually. However, these models could be updated twice a year, quarterly or even monthly. If you want quicker updates of the ABC model, then it is essential that you have very streamlined and automated processes for sourcing data and updating the model. The right cadence also depends on how your business operates; as an example, it makes no sense to have monthly updates of our Higher Education models if revenue only comes into the organization on an annual or semester basis.
Higher Education Activity-Based Cost Models
If you are a Higher Education institution, we are in the planning stages of developing training to take you through the specifics of building a Higher Ed specific Activity-Based Cost model, including the data sources needed, specific data fields, key modelling rules to use and core drivers. If you are interested, please subscribe to our newsletter and we’ll let you know when it’s ready to go.