Exploring IT budget models and the case for SAP on GCP
CBB models keep track of each IT service and its value. Should the value no longer be present, you can terminate the service and all associated costs immediately, says Sizwe Mabanga, customer engineer at Digicloud Africa.
This article is targeted at budget decision-makers and aims to show that they need to hold their IT departments accountable through better consumption-based budgeting (CBB).
We speak to the clear advantages of CBB and associate it directly with cloud solutions.
The SAP on GCP section acts as a solid example of such a switch in approach.
It ends with a call to action.
“The cost of not being able to go as fast as those who have adopted newer technologies manifests as a sometimes slow, sometimes blindingly fast bleeding of whatever competitive edge your business currently enjoys.”
This article is designed to show you that your IT budgeting model is key to unlocking your cloud computing strategy. So, in the first section, we explore budgeting and evaluation models for IT departments.
In the second section of this article, we bring the learnings to life as we inspect the total cost of ownership (TCO) of an SAP on Google Cloud Platform (GCP) solution as an example. We look at how SAP on GCP speaks to consumption-based budgeting models that maximise flexibility and deliver on value while decreasing the TCO.
Section 1: Exploring IT budget models
It turns out that the reason you want IT budget models in the first place is so you can control costs. Many organisations find it difficult to keep track of their burgeoning IT costs, let alone control them. This is because of the murky waters of cost centres vs profit centres, capital vs operational expenditure, and exactly where to allocate IT staff costs for projects. This murkiness makes decisions about IT investments difficult because internal service costs are hard to pin down. IT budget meetings have been known to become very, very tense.
So, let’s look at two popular budgeting models.
Yearly rollover budget vs zero-based budget
Back when I was a Java developer in large corporations, I would see my managers, usually around January, all flustered about the yearly budget. They’d be running around the office collecting numbers, questioning people’s timesheets and figuring out how to spend the remainder of last year’s budget. From what I could gather, they needed these numbers because they would use them to go, hat in hand, to their budget overlords requesting a repeat of last year’s budget plus 10% for inflation, or something like that.
Let’s call that the Yearly Rollover Budget (YRB) Model and announce it as the official villain in this article. It is still widely used because it is an easy sell. This is especially the case if your budget overlords are not questioning the “value” of the IT services they are about to purchase for another year. Yet it is the villain because it offers neither flexibility for more agile cost management nor insights into the value that these costs are delivering.
When your budget overlords are questioning the value of your IT services and you do want some agility and insight, it might be time to whip out the old Zero-Based Budget (ZBB) Model.
ZBB models ask you to re-justify each IT expense before it is included in the next budget period. This helps leaders stay close to the real cost of their IT solutions and periodically trim off unnecessary costs. Furthermore, the exercise of cost enumeration with an eye to justification generates exactly the kind of insights that allow leaders to compare the internal costs of various IT services against those of a third-party service provider (like cloud services).
While ZBB models are a far better alternative to YRB models at keeping everyone plugged in to costs and value, they require a fair bit of effort to get right, with highly variable returns on said effort. Most IT departments do a heroic best-effort ZBB, but many costs, such as headcount and on-premises hardware operating costs, simply roll over uninspected. It is only when a ZBB model is applied from a genuine zero base and the costs are well understood that the comparison with infrastructure as a service (IaaS), software as a service (SaaS) and platform as a service (PaaS) becomes meaningful.
And what that comparison often yields is the reminder that for most businesses, IT is not their core business. They are really accountants, retailers, underwriters and bankers. IT became a competency in their businesses because the industry was new and everybody was still feeling it out. Now, though, IT competency is being spearheaded by hyperscale cloud service providers such as Google Cloud Platform (GCP) and its competitors.
The standards of infrastructure, security and operational support are higher in cloud providers because IT actually is their core business. Cloud solutions make sense, even if they come in at the same price, because they free businesses from IT infrastructure concerns and allow them to concentrate on simply using IT to enable their actual core business.
Opportunity cost – the silent killer
Speed. Things are changing faster and faster and the rate of technological innovation is catching everyone off-guard. The most valuable asset you have as a technology-enabled business is the ability to match that rate of innovation. Fortunately for you, it seems that this is exactly what good cloud services are designed to do for you.
Good cloud services set you up with guardrails in the form of highly secure, on-demand, world-class IaaS, SaaS and PaaS services, that allow you to go really, really fast. The real promise of the cloud is that you will be able to try things and figure out whether they work for your business within weeks or even days, instead of the months we’re accustomed to from internal IT build projects.
The cost of not being able to go as fast as those who have adopted newer technologies is known as opportunity cost. Opportunity cost manifests as a sometimes slow, sometimes blindingly fast bleeding of whatever competitive edge your business might have had. Opportunity cost can be called the hidden slayer of such giants as Blackberry, which was far too slow to jump onto touchscreens; Blockbuster Video, which was too prideful to realise that online content delivery was a real threat to its dominance; and, closer to home, Mxit, which was too slow to modernise its platform to compete with global platforms such as WhatsApp.
You know you are hitting opportunity cost when your IT department is telling you what they can’t do, or that they can do it, but it will take 12 months and capital expenditure.
Cloud service providers help reduce this opportunity cost to almost zero by introducing consumption-based billing where you only pay for what you use on a month-to-month basis or commit to a minimum capacity at a reduced rate on a yearly basis.
The Consumption-Based Budget (CBB) Model is basically the protagonist and all-round good guy of this article and is here to help you do far more interesting things with your IT competency.
Less certainty, more control – consumption-based budgeting
So, the first thing you need to do when you step into the new world of CBB models is to let go of certainty. At this stage, you see that perhaps the ability to put a number to your yearly budget at the beginning of the year was an illusion to begin with.
You simply cannot know in January exactly how much you’ll be paying for your cloud services in October. And you shouldn’t want to either, because you also know that, as a result of switching to a CBB model, between January and October you had the flexibility to go live with seven customer-facing projects, of which two are doing great and generating sales, but four have been scaled down to bare bones because of non-performance, and one has been completely shut off.
The point is that in 10 months, you tried seven things, of which two worked. This is very different from traditional enterprise IT budget models that put too much emphasis on going all-in with what looks like the best solution in January and discovering in October that it doesn’t work, but having no choice apart from continuing with it because you committed to it in January.
But with the lack of certainty comes ever greater control. Cloud service costs are controlled through a system of consumption monitors, account thresholds and budget alerts. And actually, that’s all a CBB model is: a set of monitors, thresholds and alerts that keep you from overspending, while fully empowering your IT team to innovate and move ever faster. Combined with the ability to shut off consumption when the value is no longer clear, thresholds and budget alerts give the enterprise far more control over its IT costs than it has ever had. At the same time, the model keeps the value that each service is bringing squarely in view throughout the year (not just at the yearly budget review).
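To make the monitors-thresholds-alerts idea concrete, here is a minimal sketch in Python. The service names and amounts are hypothetical, and real implementations would use your cloud provider’s billing APIs; this only illustrates the shape of the logic a CBB model formalises:

```python
# A minimal sketch of consumption-based budget control:
# monitors track spend per service, and thresholds trigger alerts
# for services whose spend needs a value conversation (or a shut-off).
# Service names and amounts below are hypothetical.

def check_budgets(monthly_spend, thresholds):
    """Return alert messages for services at or over their threshold."""
    alerts = []
    for service, spend in monthly_spend.items():
        limit = thresholds.get(service)
        if limit is not None and spend >= limit:
            alerts.append(f"{service}: spend {spend} has reached threshold {limit}")
    return alerts

monthly_spend = {"sap-prod": 9200, "pilot-app": 450, "legacy-batch": 1300}
thresholds = {"sap-prod": 10000, "pilot-app": 400, "legacy-batch": 1000}

for alert in check_budgets(monthly_spend, thresholds):
    print(alert)
```

In practice, the alert would go to a mailing list or chat channel, and the follow-up question is always the same: is this service still delivering value, or is it time to shut it off?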
CBB models keep track of each IT service and the value each brings and, should the value no longer be present, allow you to terminate the service and all associated costs immediately. If this isn’t the dizzying new height of budget control, then I don’t know what is.
Section 2: The case for SAP on GCP
If the previous section has still not piqued your interest, please join me as we dig into a case study that I know most enterprises and corporations will relate to: running SAP.
SAP remains a key enabler of many African enterprises with its wide-ranging, highly customisable and often business-critical software solutions. SAP has recently transformed its solution direction with SAP HANA, a high-performance in-memory database that accelerates data-driven, real-time decision-making and actions. This is important to note because, with this change, the key factor in choosing hardware is no longer the number of processors but the amount of RAM. For a decent-sized corporation with robust SAP-supported operations, the production server may need up to 6TB of RAM.
The reason SAP has taken this direction is that it knows collecting and mining large amounts of data is the operational IT model of the future. The data will provide actionable insights and strategic decision-making support in as real-time a manner as possible.
From relying on rockstars to real-time insight
And, let’s be honest. We have followed a lot of decisions from people who sounded right at the time the decision was made. Leadership, for most of the 20th century, made decisions based on the confident “logical thinking” or sometimes “gut feel” of some corporate rockstar or leader.
Early in the 21st century, however, a new big player has made a confident debut in the decision-making matrix: real-time data insights.
Data that businesses accumulate, as they operate, is now understood to be an under-leveraged advantage that can be used to directly increase bottom lines. When businesses have real-time insights into their operations and how consumers are reacting to their offerings, they can make better timed and better targeted decisions or strategic shifts.
SAP, never ones to be caught flat-footed, have re-architected themselves and their solutions to be able to deliver real-time data-driven actionable insights. Say that three times fast. This re-architecting obviously included a move to the cloud and with that a very close relationship with Google Cloud.
When SAP is hosted in GCP, the ability to mine real-time insights is heightened by seamless integration with GCP’s BigQuery. BigQuery is an industry-leading PaaS data warehousing solution that is at the centre of many of Google’s data and machine learning efforts. It empowers data analysts to quickly import, analyse and mine data with a growing set of machine learning (ML) and artificial intelligence (AI) techniques.
The big value is the ease with which you’ll be able to exchange data between your SAP installation and BigQuery. This can be the technological advantage that differentiates your operational team from your competitor’s by ensuring the right data gets to the right operations people who need to act on it in the field.
And, yes, you’re right. Real-time data insights cannot and should not replace good leaders and managers. The idea is that the same leader who led on “gut feeling” will now have the data to create shared context, support tactical decision-making and go really, really fast.
The cost model comparison
“But how much does it cost compared to what I’m currently paying?” is not the question you should be asking. The question is: “How can I get the best, most real-time value from my IT solutions?” But if you insist on the cost comparison, please have a look at the following diagrams to ensure we know what kind of apples we’re comparing.
First, with the above diagram, we look at the idea that the cost of hosting SAP on GCP is incredibly flexible. With SAP on GCP, you can right-size your virtual machines to exactly what you need and realise cost savings immediately. This looks simple on the surface but has very powerful implications for your IT budget strategy.
The switch in IT infrastructure/licensing spend from self-hosted to cloud is also a switch from capital expenditure to operational expenditure. Paying monthly and avoiding large capital outlays keeps the operational budget agile and adaptable to emerging situations and business needs. The below diagram presents a direct comparison of capacity purchase strategies. It pits purchasing a static amount of capacity up front against purchasing capacity through a series of stacked committed use discounts (CUDs) in GCP. CUDs are delightful cost saving instruments that provide deeply discounted prices in exchange for your commitment to use a minimum level of resources for a specified term.
Immediately we see that the cloud strategy is far more cost-effective because you do not pay for capacity that you will not utilise until you're a couple of years into the solution lifetime. All data pools grow over time and hardly ever start out at 6 Terabytes.
You will also see in the diagram that by forecasting your minimum capacity requirements and using CUDs, you can drastically reduce the cost of your cloud services. If you forecast that in the first year of using SAP HANA, you will use no less than 1.5 Terabytes of RAM then you can create a commit for that 1.5TB, and fix costs around it at a reduced rate.
At the beginning of your second year, when you see how your minimum required capacity has grown, you can “stack” another commit on top of your existing one and pay a committed rate for 3.4TB in total.
Yet the most interesting thing for me happens at the beginning of year three. At year three, your expansion plan kicks in, your operational footprint grows, and your minimum data capacity needs double to 7TB. In GCP, this is achieved with a seamless move to newer infrastructure and adding another commit to your stack as opposed to being stuck with a 6TB upper limit on the ageing machinery that you purchased. In fact, GCP is an industry leader in scaling the amount of memory available to virtual machines and today you can purchase a virtual machine with up to 12TB of memory without ever talking to a salesperson.
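The stacked-commit arithmetic above can be sketched as a toy cost comparison. The unit price and the 30% commit discount below are illustrative assumptions for the sake of the exercise, not GCP list prices:

```python
# Toy comparison: buying 6TB of capacity up front vs stacking
# committed use discounts (CUDs) as the footprint grows.
# The unit price and 30% discount are illustrative assumptions.

ON_DEMAND_PER_TB_YEAR = 100.0   # hypothetical unit price per TB per year
CUD_DISCOUNT = 0.30             # hypothetical committed-use discount

def upfront_cost(capacity_tb, years):
    """Pay for the full capacity every year, used or not."""
    return capacity_tb * ON_DEMAND_PER_TB_YEAR * years

def stacked_cud_cost(yearly_commit_totals_tb):
    """Each year, pay the committed rate for that year's total stacked commit."""
    rate = ON_DEMAND_PER_TB_YEAR * (1 - CUD_DISCOUNT)
    return sum(total_tb * rate for total_tb in yearly_commit_totals_tb)

# Year 1: 1.5TB commit; year 2: stacked to 3.4TB; year 3: grown to 7TB.
stacked = stacked_cud_cost([1.5, 3.4, 7.0])
upfront = upfront_cost(6.0, 3)

print(f"Upfront 6TB for 3 years: {upfront:.0f}")
print(f"Stacked commits:         {stacked:.0f}")
# Note: the upfront 6TB machine could not even serve year three's 7TB need,
# while the stacked-commit path simply adds another commit.
```

With these assumed numbers, the stacked path costs less than half the upfront purchase over three years, and, more importantly, it never hits the 6TB ceiling that the upfront purchase imposes.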
The fact that you can do this, start up a 12TB virtual machine right now on GCP, is, in my opinion, Google Cloud’s haka at the competition, as shown in the adjacent image. The competition is far behind GCP’s hypervisor on this front.
And in case it is not clear, the hidden beauty in this flexible approach is that you will always have the ability to leverage the latest tech on offer, like 12TB VMs, rather than having to undergo major modernisation exercises every time your machinery reaches those upper limits or the end of its useful life. More control. Lower opportunity costs.
There are numerous other reasons why running SAP on GCP is a better option than hosting it yourself, or on any other cloud for that matter, but the most important ones are summarised below:
- Cloud IT strategies allow one to move to a consumption-based budget model which increases your operational flexibility while decreasing operational risk.
- GCP and its competitors are leading the way in IT innovation, which can improve existing revenue streams and unlock new ones for businesses. GCP engineers are experts in their fields and are leveraging their core competencies to bring market-leading solutions to the marketplace. More and more often, we see companies wanting to embrace technology as a differentiator (eg, we hear "we want to work like a tech company" from a bank).
- SAP HANA is an in-memory database, which adds a focus on ever larger RAM capacity in virtual machines and is designed to enable faster and faster operational insights. GCP has industry-leading, specialised VMs for this.
- SAP HANA + Google’s BigQuery is the easiest way to create an operational advantage using ML/AI.
- While the value argument for hosting SAP on GCP overshadows the cost argument, hosting SAP on GCP still turns out to be far more cost-effective, with better ROI and far more flexibility from a budgeting perspective.
All of the above is presented with great respect to each corporation’s individual journey. If, though, you are ready to start transitioning to a different way of doing things then I suggest that your first step be to find your official Google Cloud Partner and begin a conversation on how to tailor your journey to your needs.
Alternatively, fill in this form and a representative of Digicloud Africa, which is Google’s Cloud enablement partner for Africa, will give you a call and help you find the Google Cloud Partner that is right for you.
Below are a couple of links that helped me write this article:
If you’ve made it this far, thank you so much for walking this learning journey with me. Stay safe.