Cloud computing has radically transformed how we deal with information. From cold storage and backups to everyday office work, many of us depend on large data centers and their promise of controlled, low monthly costs and environmentally friendly systems, along with a fair “pay-per-use” pricing model that only charges us for what we use. “Fair” being a very relative term.
Over the last few months, we have been observing some worrying trends. First, the war between Russia and Ukraine is straining energy availability and cost, potentially for years to come. Second, rising tensions between China and Taiwan are putting the supply of microchips in jeopardy. Third, growing concerns about global warming and water consumption may put further stress on the cloud’s cost structure.
Using the cloud for cold storage remains a convenient and efficient solution, but we should be ready for a slight increase in price, and possibly for more limitations on the “free” offerings. With this in mind, we may need to delete some of the 150 GB of videos our family sent us on WhatsApp, clean up old photos and videos, and purge old email. Or simply be ready to pay the price for the peace of mind of not having to keep hard-disk backups everywhere.
A dirty little secret
Data centers are incredibly energy hungry, requiring a HUGE amount of power to operate, and gas and electricity prices are skyrocketing in Europe, reaching new historic highs every month. Surprisingly, finding exact figures on the topic is not easy. It’s as if this were a dirty little secret that nobody really wants to talk about. Fortunately, we can make some inferences:
- One Microsoft data center in the Netherlands used over 100 MILLION liters of water for cooling, in a cold year no less! (https://www.clubic.com/pro/entreprises/microsoft/actualite-434481-en-pleine-penurie-d-eau-les-pays-bas-en-decouvrent-la-consommation-dantesque-des-data-centers-de-microsoft.html)
- Other sources reveal that in Ireland, Microsoft and Amazon alone are using pretty much all of the island’s wind-powered energy. It seems their expansion plans CANNOT BE COVERED by the current supply, which could even lead them to leave (https://www.zdnet.com/article/power-shortages-threat-aws-and-microsofts-eu-data-center-expansion/)
- Depending on the estimate, data centers consume around 1-4% of total worldwide energy production. And this share keeps rising.
As a reference, a desktop computer uses around 600 kWh per year (screen, printer, and speakers included), roughly 4% of an average US household’s consumption (https://www.eia.gov/tools/faqs/faq.php?id=97&t=3 and https://www.energuide.be/en/questions-answers/how-much-power-does-a-computer-use-and-how-much-co2-does-that-represent/54/). Knowing that residential consumption is a tiny 10.9% of the total (https://ourworldindata.org/emissions-by-sector), you should not worry too much about your PC.
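As a quick back-of-envelope check, here is that reasoning in a few lines of Python. The 4% and 10.9% shares come from the sources above; combining them this way is our own rough estimate:

```python
# Back-of-envelope: how much of total energy does a home PC represent?
# The two shares below come from the figures cited above.
pc_share_of_household = 0.04       # a desktop PC is ~4% of a US household's use
household_share_of_total = 0.109   # residential use is ~10.9% of the total

# Even if every household ran a PC like yours, PCs would only account for:
pc_share_of_total = pc_share_of_household * household_share_of_total
print(f"~{pc_share_of_total:.2%} of total energy")  # ~0.44%
```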
In some European countries, the cost of electricity has already risen roughly fourfold. At that rate, cloud prices could easily double just for providers to keep their current margins intact. OVH, for example, has already announced a 10% price increase to face this cost. If that 10% directly compensates their higher energy bill, and if their effective energy cost rose by around 50% rather than the full spot-market increase (large providers typically buy energy on long-term contracts), then energy would represent around 20% of their operating costs.
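Here is a minimal sketch of that back-of-envelope inference. Only OVH’s 10% price increase comes from their announcement; the energy-rise scenarios are our assumptions:

```python
# Inferring the share of energy in a provider's operating costs.
# If a price increase exactly passes a higher energy bill through to customers:
#   price_increase = energy_share * energy_cost_rise
def implied_energy_share(price_increase: float, energy_cost_rise: float) -> float:
    return price_increase / energy_cost_rise

# OVH announced a 10% price increase. Assuming their effective energy bill
# rose ~50% (long-term contracts, not spot prices; our assumption):
print(implied_energy_share(0.10, 0.50))  # 0.20 -> energy is ~20% of costs

# If instead they paid close to spot prices (roughly +300%), the implied
# share would be much smaller:
print(implied_energy_share(0.10, 3.00))  # ~0.03 -> energy is ~3% of costs
```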
The heavy cost of cloud analytics
The real problem comes from heavy cloud usage for analytics and data processing. In this nice article (https://arstechnica.com/information-technology/2022/08/no-code-wrapped-our-ml-experiment-concludes-but-did-the-machine-win/) we see that creating a very simple model can cost up to 1,300 USD using current autoML solutions (and take HOURS to deliver results). All of a sudden, we understand how the cloud came to represent such a huge share of Amazon’s margin! If you have a team of data scientists working on fairly large amounts of data, costs can easily skyrocket. Should the cost of accessing your data increase by 20% or 40%, are you ready? What if it doubles, or more? What is your contingency plan? The latest estimate from Gartner is that 80% of data centers will have migrated to the cloud by 2025. In short, can we really afford to leave processing as a variable cost under such uncertain conditions?
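To make this concrete, here is a hypothetical budget sensitivity sketch. Only the 1,300 USD per model comes from the article above; the workload and the increase scenarios are illustrative assumptions:

```python
# Hypothetical cloud-analytics budget under different price-increase scenarios.
# Only the ~1,300 USD per model comes from the Ars Technica experiment;
# the workload below is an illustrative assumption.
cost_per_model = 1_300   # USD per simple autoML model
models_per_month = 20    # assumption: a small data science team's workload
baseline = cost_per_model * models_per_month * 12  # annual spend

for increase in (0.0, 0.20, 0.40, 1.00):  # today, +20%, +40%, doubled
    print(f"+{increase:4.0%}: {baseline * (1 + increase):>10,.0f} USD per year")
# +  0%:    312,000 USD per year
# + 20%:    374,400 USD per year
# + 40%:    436,800 USD per year
# +100%:    624,000 USD per year
```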
While this “pay-per-use” system makes sense for organizations that only require one or two models per year on a small number of records, many organizations need to process data daily: explore it, transform it, and run dozens or even hundreds of predictive models. This requires a different approach, one in which costs are controlled, predictable, and ideally fixed.
These organizations have already realized that such operations are quite expensive. Many have come to accept this as the price to pay for the perceived advantages of the solution. But is it?
The TIMi Solution
There is no single solution for everything. Many parts of your operations genuinely need distributed computing, and we don’t pretend to replace an entire ecosystem. What we offer is a way to control those costs and boost the output of your analytics activities.
Indeed, for data processing and predictive analytics, TIMi offers a simple and viable solution:
- Process billions of records on a single server, with no associated variable costs, and FASTER than any other existing solution. Anything except videos and images can be processed in a very cost-efficient way, at greater speed!
- Build highly accurate predictive models in a single click and get results in minutes, on that same single server, again with no variable costs.
As we just saw, these two activities might well be the largest items on your cloud provider’s invoice.
With TIMi, you only pay a per-server license, which can save you hundreds of thousands of euros every year. Minimizing the environmental impact of your data science team is a nice cherry on the cake. Download it, try it, and you will see data analytics from a new perspective: light, efficient, and ROI-oriented!