White paper: Inside Kinetica


Johannesburg, 06 Dec 2019

For years, the CPU clock speed race trumped almost every other IT metric, and many applications were consequently optimised around single-CPU architectures. Big data applications were benchmarked by the volume, velocity and variety of the data they processed, almost entirely ignoring the complexity of the analysis involved and the unpredictable nature of the data itself: long-lived versus perishable, human-generated versus machine-generated, and structured versus unstructured, among others.

Today’s big data applications are getting even bigger and now face an extreme data problem: ever-growing, unpredictable data and the increasing complexity of analysis it brings. Real, actionable insight is rapidly becoming every company’s most valuable resource, and businesses therefore need to act with unprecedented agility to grasp the bigger picture.

Traditional systems, however common, were not designed to handle the large-scale analytical and streaming workloads found in modern data-warehousing, decision-support and business intelligence applications. Kinetica is a GPU-powered insight engine built for the extreme data economy and for exactly these workloads. Whether you are a data scientist or an architect, this white paper is for you: it goes deep into Kinetica's internal architecture, assuming you have a basic understanding of Kinetica and want to explore what's under the hood.
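To ground this, here is a minimal, illustrative sketch of what such a workload looks like from the client side, using Kinetica's Python client (the gpudb package). The host, port, table name and sample data are hypothetical, and constructor signatures may differ between client versions; the white paper and the official API documentation are the authoritative references.

```python
# Illustrative sketch only: connection details, table name and data are
# hypothetical, and signatures may vary across versions of the client.
import gpudb

# Connect to a Kinetica head node (9191 is the default REST port).
db = gpudb.GPUdb(host="127.0.0.1", port="9191")

# Create a simple table; columns are declared as [name, type] pairs.
table = gpudb.GPUdbTable(
    _type=[["sensor_id", "string"], ["reading", "double"]],
    name="sensor_readings",
    db=db,
)

# Insert a small batch of records; in a streaming workload this call
# would be driven continuously by an ingest pipeline.
table.insert_records([["s1", 21.4], ["s2", 19.8], ["s1", 22.1]])

# Run an aggregation server-side, where the GPU does the heavy lifting.
result = db.aggregate_group_by(
    table_name="sensor_readings",
    column_names=["sensor_id", "avg(reading) as avg_reading"],
    offset=0,
    limit=100,
)
print(result)
```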
