Yes, it can, if you’re careful.
Cloud computing is already ubiquitous: 81% of American organizations were using it in 2020, up 9% in just two years. Companies already in the cloud continue to invest more in their cloud services, and the manufacturing sector is expected to be among the prime adopters over the next 5-10 years.
Cost savings are one of the highest-priority goals for companies deciding to adopt cloud technology. Yet one of the top challenges reported by companies already using cloud services is controlling cloud costs. How can reduced costs be both a lure for new adopters and a pain point for veteran cloud users? How should you think about the costs of operating in the Cloud?
Cloud computing is like the “gig economy” of IT. Just as Uber wants extra drivers working the 8am commuter rush, your IT department wants extra servers running when your research department starts crunching numbers for their latest idea. Once the 10am lull in traffic arrives, Uber doesn’t need as many drivers for the daily commute and can direct them towards airports and transit hubs. Similarly, once your simulations have finished running, IT doesn’t need those servers anymore, and they especially don’t want to be stuck babysitting them through future upgrades.
The cloud turns servers into gig economy workers: you pay for them only when you need them. The trade-off is a simple balance between up-front hardware costs and minute-by-minute cloud computing charges. With some information about your server usage, customer base, data constraints, and peak requirements, you can even use equations from the Industrial and Management Engineering department at IIT to determine whether cloud adoption can save you money.
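To make the trade-off concrete, here is a deliberately simplified sketch of that balance (all figures are hypothetical, and a real model would also account for staffing, power, networking, and data-transfer costs):

```python
# Simplified break-even sketch: owning a server vs. renting cloud time.
# All dollar figures and rates below are illustrative assumptions only.

def on_prem_cost(hardware_cost, lifespan_years, annual_upkeep):
    """Average yearly cost of owning a server outright."""
    return hardware_cost / lifespan_years + annual_upkeep

def cloud_cost(hourly_rate, hours_used_per_year):
    """Yearly cost of renting an equivalent cloud instance."""
    return hourly_rate * hours_used_per_year

# Hypothetical figures: a $6,000 server amortized over 4 years with
# $500/year upkeep, vs. a $0.40/hour cloud instance.
own = on_prem_cost(6000, 4, 500)          # $2,000/year
rent_full_time = cloud_cost(0.40, 8760)   # ~$3,504/year if it never idles
rent_part_time = cloud_cost(0.40, 2000)   # $800/year at ~23% utilization

print(f"own: ${own:.0f}, 24/7 cloud: ${rent_full_time:.0f}, "
      f"part-time cloud: ${rent_part_time:.0f}")
```

The point of the sketch: renting wins when utilization is bursty, and owning wins when a machine would run near-constantly. The real equations refine exactly this comparison.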
While only paying for the computing resources you need when you need them sounds nice, efficiently anticipating and responding to resource demands can be a real challenge. A barista will leave when their shift is over, but a server won’t just turn off when you don’t need it. Reacting to fluctuating computation requirements is essential to avoiding unnecessary server charges, and cloud users don’t always have the best reaction times. In 2020, cloud users reported that at least 30% of their cloud computing expenditures went to wasted or idle servers.
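In practice, that "reaction time" is usually an autoscaling policy: a rule that resizes the server pool as demand shifts. A minimal sketch of such a policy, with hypothetical thresholds of my own choosing, might look like this:

```python
# Minimal autoscaling sketch (hypothetical capacity numbers): size the
# server pool to current demand so you stop paying for idle "gig" workers.

def desired_servers(active_jobs, jobs_per_server=10, min_servers=1):
    """Servers needed to cover current demand, never below a warm minimum."""
    needed = -(-active_jobs // jobs_per_server)  # ceiling division
    return max(needed, min_servers)

# Morning crunch: 95 jobs needs 10 servers; midday lull: 12 jobs needs 2;
# overnight: keep one warm server as a floor.
print(desired_servers(95))  # 10
print(desired_servers(12))  # 2
print(desired_servers(0))   # 1
```

Real cloud autoscalers layer cooldown timers and demand forecasts on top of a rule like this, precisely because slow scale-down is where that 30% of wasted spend comes from.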
Even accounting for some wasted server time, cloud expenditures versus in-house server costs isn’t the only comparison you should be making. You may have to think about your server usage and security differently, work with some new technologies, or even restructure your computation patterns to take full advantage of your “gig” servers.
But right now, your best bet is in the Cloud.
Cloud providers all want to sell us their computing power, and they know we won’t pay for it unless we can manage its usage and security at an acceptably low price. Cloud providers expected much faster growth in their early days, and they have had to respond to users’ concerns. Cloud management and security are improving every day.
The net result: over half of cloud adopters (52%) experience better security, and the vast majority (87%) experience business acceleration, from market expansion to employee productivity to improved time to market.
All large-scale IT changes come with pros and cons. For cloud computing, the scales tip in favor of adoption.