DATA IN MOTION
Anatella handles tables with billions of rows and thousands of columns even on small infrastructures.
Develop complex data transformations faster using an intuitive interface that requires no code.
Easily extract, clean, aggregate and join all kinds of datasets and inject them into any tool.
Create or import transformations in R, Python & JS using the plugin system and the collaborative framework.
Fast data management
Extract, join, clean and aggregate any datasets faster than ever, and easily inject them into an RDBMS, a BI tool, a modeling tool, or R/Python.
Perform any data-quality and data-cleaning task on large data volumes. Anatella's cleaning procedures are optimized to work on several billion rows.
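To make the idea concrete, here is a minimal, purely illustrative sketch of row-level cleaning steps (trimming, normalizing empty values, de-duplicating) in plain Python. It is a stand-in for the kind of work such procedures do, not Anatella's actual implementation, and it streams over rows so it scales with row count rather than loading everything at once.

```python
# Illustrative sketch (not Anatella's actual API): typical row-level
# cleaning steps expressed as a streaming generator.

def clean_rows(rows):
    """Trim whitespace, map empty strings to None, drop exact duplicates."""
    seen = set()
    for row in rows:
        cleaned = tuple((c.strip() or None) if isinstance(c, str) else c
                        for c in row)
        if cleaned not in seen:       # keep only the first occurrence
            seen.add(cleaned)
            yield cleaned

raw = [("  Alice ", "BE"), ("Alice", "BE"), ("Bob", "")]
print(list(clean_rows(raw)))  # -> [('Alice', 'BE'), ('Bob', None)]
```

Because `clean_rows` is a generator, the same logic applies unchanged whether the input is a small list or a multi-billion-row file read line by line.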
R and Python integration
Anatella offers an abstraction layer around the code that allows analysts and coders to communicate smoothly and make progress in a collaborative way. Everybody can bring meaningful contributions to solve the analytical problem at hand, without any technological barriers.
Anatella automatically updates all the charts and graphs of your MS Office reports at very high speed (with a system that you can easily maintain and improve!).
Built for Machine Learning
Anatella includes strong text mining, data mining and graph mining capabilities that can be extended through the R/Python integration. It also includes many functionalities tailored for predictive modeling, such as feature engineering, meta-data-free transformations, etc.
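As a small hedged example of what "feature engineering" means here: a typical step aggregates raw transactions into one row of model-ready features per customer. The field names and shape below are illustrative assumptions, not Anatella's own transformations.

```python
# Hypothetical feature-engineering step: turn raw (customer_id, amount)
# transactions into per-customer features (frequency, total, average).
from collections import defaultdict

def customer_features(transactions):
    """transactions: iterable of (customer_id, amount) pairs."""
    totals = defaultdict(lambda: [0, 0.0])   # id -> [count, sum]
    for cid, amount in transactions:
        totals[cid][0] += 1
        totals[cid][1] += amount
    return {cid: {"n_tx": n, "total": s, "avg": s / n}
            for cid, (n, s) in totals.items()}

tx = [("c1", 10.0), ("c1", 30.0), ("c2", 5.0)]
print(customer_features(tx))
# -> {'c1': {'n_tx': 2, 'total': 40.0, 'avg': 20.0},
#     'c2': {'n_tx': 1, 'total': 5.0, 'avg': 5.0}}
```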
Built For Iterative Work
BI, Analytics and Predictive Analytics projects are characterized by their “exploratory” nature. That’s why Anatella allows quick iteration over different variations of your data transformations.
Run Social Network Analytics (SNA) and graph mining algorithms on large graphs with several dozen million nodes and several billion arcs (to compute communities, social leaders, etc.).
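To give a flavor of graph mining on an edge list, here is a deliberately tiny stand-in: connected components, one of the simplest graph primitives, in pure Python. Real community-detection algorithms (and Anatella's own SNA routines, which are not shown here) are far more sophisticated, but the input shape — a list of arcs — is the same.

```python
# Toy graph-mining primitive: connected components over an edge list.
# A much simpler stand-in for real community detection.
from collections import defaultdict

def connected_components(edges):
    adj = defaultdict(set)
    for a, b in edges:               # build an undirected adjacency map
        adj[a].add(b)
        adj[b].add(a)
    seen, components = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:                 # iterative depth-first traversal
            node = stack.pop()
            if node not in comp:
                comp.add(node)
                stack.extend(adj[node] - comp)
        seen |= comp
        components.append(sorted(comp))
    return components

edges = [("a", "b"), ("b", "c"), ("x", "y")]
print(connected_components(edges))  # -> [['a', 'b', 'c'], ['x', 'y']]
```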
Store vast amounts of data with less disk space thanks to Anatella's exclusive .gel and .cgel file formats.
Hadoop & Spark Integration
Anatella integrates easily with Hadoop & Spark through native, low-level C code that reads/writes .parquet files directly from/to a local or HDFS drive. We nevertheless suggest avoiding these technologies because they are very inefficient.
Inject practically any type of data into your data warehouse (based on MS SQL Server, Oracle, Teradata, etc.) and the other way around. Anatella fully supports even the most exotic database engines.
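The "inject into a database" step can be sketched with the standard-library `sqlite3` module as a stand-in for a real warehouse driver (MS SQL Server, Oracle or Teradata would each use their own connector, but the bulk-insert pattern is the same):

```python
# Sketch of bulk-loading rows into a database; sqlite3 stands in
# for a real data-warehouse connector.
import sqlite3

rows = [(1, "Alice"), (2, "Bob")]
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", rows)  # bulk insert
n_rows = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(n_rows)  # -> 2
conn.close()
```

With large volumes, the key point is the `executemany`-style batched insert: sending rows one statement at a time is what makes naive loaders slow.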
Deploy your data transformations on your production server/cluster in a few mouse clicks. Anatella runs on both Windows and Linux servers.
Real-time streaming (IoT and HTTP)
Anatella works not only in classical "batch mode" but also in real-time streaming mode. It makes direct connections to common IoT brokers straightforward, while sustaining a practically unlimited number of simultaneous connections.
No cloud required
The dataset sizes that you can manipulate with Anatella are limited only by the size of your drive, not by your RAM. For 99% of companies, Anatella's engine is so efficient (both in computation speed and in data compression for storage) that one ordinary laptop is more than enough to handle all the tasks at hand (and thus no cloud is required!).
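Why the drive, and not the RAM, is the limit comes down to out-of-core processing: reading and aggregating a file chunk by chunk so that only one chunk is ever in memory. A hedged sketch (the file content and chunk size below are illustrative):

```python
# Out-of-core sketch: aggregate a file of arbitrary size while holding
# only one small chunk of rows in memory at a time.
import csv
import io

def sum_column(lines, column, chunk_size=2):
    reader = csv.DictReader(lines)
    total = 0.0
    while True:
        chunk = [row for _, row in zip(range(chunk_size), reader)]
        if not chunk:
            break
        total += sum(float(row[column]) for row in chunk)  # only this chunk in RAM
    return total

data = io.StringIO("amount\n10\n20\n30\n40\n")
print(sum_column(data, "amount"))  # -> 100.0
```

The same loop works unchanged on a file handle opened over a terabyte-size CSV, since memory use depends on `chunk_size`, not on file size.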
Anatella has a small wizard-based installer that sets it up in less than a minute. There is also a portable version that requires no installation and no system privileges: with it on a USB stick, you can run Anatella anywhere.
Low licensing cost
There are no licensing fees based on the volume of processed data. If your business grows, your data volume grows with it: with Anatella, you can still process all your data as often as you need (i.e. no "data tax"). Furthermore, the Community Edition of Anatella is 100% free and already covers many of the usual business cases.
“The optimal solution to extract advanced Social Network Algorithms metrics out of gigantic social data graphs.”
“We reduced by 10% the churn on the customer-segment with the highest churn rate.”
“TIMi framework includes a very flexible ETL tool that swiftly handles terabyte-size datasets on an ordinary desktop computer.”