Data infrastructure must be efficient and scalable
Aware of the enormous potential of Big Data, more and more companies are looking for data analytics solutions that can meet their current and future needs. Efficiency and scalability are crucial in this regard.
Companies around the world are launching data-driven initiatives. To allow their analysts to extract valuable insights, they are investing heavily in cloud infrastructure, in the hope that the results will cover the hefty bills from Amazon and Azure for running these huge calculations in large data centres. The right choice of hardware infrastructure is crucial, as it has a major impact on analyst efficiency. Typically, analysts spend more than 80% of their time on simple data management tasks (e.g. searching, integrating and cleaning data). This is known as the “80/20 data science dilemma”. A user-friendly solution for rapid data transformation therefore plays a crucial role in the comfort, efficiency and productivity of every data analyst within an organisation.
Big Data infrastructure
One of the most common infrastructure mistakes is the use of Spark, Hadoop and Java, the Big Data technologies typically associated with the cloud. This is not only very expensive (99% of the companies that use them pay more than half a million euros a year to Amazon or Azure) but also ineffective: these technologies are unable to provide adequate scalability and acceleration. An important property of a good infrastructure is that it delivers more computing power as servers are added. For example, if the number of servers is doubled, the computation time should be halved; in technical terms, you should get an acceleration (speedup) of 2.
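As a purely illustrative sketch (the runtimes and function name below are hypothetical, not measurements of any particular platform), this notion of acceleration is simply the single-server runtime divided by the runtime on N servers:

    # Illustrative only: acceleration (speedup) of a cluster relative to one server.
    def speedup(time_one_server: float, time_n_servers: float) -> float:
        """Speedup = T(1) / T(N); 2.0 means the job finished twice as fast."""
        return time_one_server / time_n_servers

    # Hypothetical example: a job taking 60 minutes on 1 server and
    # 30 minutes on 2 servers achieves the ideal speedup of 2.
    print(speedup(60.0, 30.0))  # 2.0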
Broken promises
What initially drove companies to adopt Spark, Hadoop or Java was the promise of almost unlimited acceleration through the addition of servers. However, it has recently become apparent that the maximum acceleration for typical data management tasks is usually only 2, regardless of the number of servers, even if there are several thousand. This means that despite considerable investments, data analysts still face endless processing times and regular disruptions. Whose fault is this? The limited acceleration and the instability of Spark, Hadoop and Java.
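The article does not spell out why the acceleration stalls at around 2, but the standard way to reason about such a cap is Amdahl's law: whatever fraction of a job cannot be parallelised limits the speedup, no matter how many servers are added. The sketch below, with an assumed serial fraction of 50% chosen only to reproduce a cap of 2, illustrates this:

    # Illustrative sketch of Amdahl's law; the 50% serial fraction is an
    # assumption chosen to show how a cap of 2 can arise, not a measured figure.
    def amdahl_speedup(serial_fraction: float, n_servers: int) -> float:
        """Maximum speedup when `serial_fraction` of the work cannot be parallelised."""
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_servers)

    # If half of a data-management job is inherently sequential (coordination,
    # shuffles, I/O), even thousands of servers cannot push the speedup past 2.
    for n in (2, 10, 1000, 10000):
        print(n, round(amdahl_speedup(0.5, n), 2))  # 1.33, 1.82, 2.0, 2.0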
Belgian innovation
Even though the Big Data technologies Spark, Hadoop and Java are a failure, and the huge investments made in their infrastructure are potentially futile, there is still a glimmer of hope for the Big Data world. And salvation comes from Belgium! TIMi is an innovative Belgian solution that, unlike the previous technologies, actually scales: when servers are added, it easily reaches accelerations of over 1,000. This is far more than Spark, Hadoop and Java, which are limited to an acceleration factor of 5 at best and usually barely reach 2. The transition to TIMi alone will result in an average acceleration of 39, and infrastructure costs will be reduced by a factor of 22.2. There is also a 100% free community version. The other good news is that this system is so simple that almost anyone with an analytical mind can quickly become a full-fledged data analyst with it.
Increasing awareness
Belgium is admittedly a small country, so data volumes are generally quite modest. Proximus, the country's largest telecommunications company, has two million users. Such a volume does not require a lot of computing power, and yet, here too, companies complain about the slowness of technologies such as Hadoop, which forces them to invest a fortune in Amazon and Azure services. And even then, their data analysts still have to wait long hours for results they could otherwise get in a matter of seconds. Some companies are already aware of this, especially the larger ones with larger volumes of data and therefore a greater need for computing power and speed. This is also why large countries, including some in the third world, are currently more advanced than Belgium in terms of data infrastructure. That is particularly unfortunate when you consider that the solution to these challenges was designed in our country.