Data Matters

Data is everywhere. Everything is being measured, scoped and monitored, providing you with endless piles of numbers and reports — but where is the value? Hidden in that data are the insights required to make better decisions: the a-ha moments that bring you higher profits, happier customers and more consistent results. Data matters. But what you do with it matters even more.

When Netflix recommends a movie or television show you might like, when Google predicts the optimal time to purchase an airline ticket or when Walmart tailors offerings to each individual online customer, these are all examples of finding value in data.

The Big Problems with Data

Before you can start finding value in data, you need to overcome a few “big” data issues.

There is a lot of it.

You have probably heard the term “big data”. The most common interpretation is that there is a LOT of data. It could be decades of statistics about weather, sports teams or world markets. It could be spreadsheets and unstructured files sitting on your company’s computers. It could be real-time sensor readings along your supply chains or manufacturing processes. The problem is not just the size of the data, but that it is scattered everywhere as well.

The big problem with big data goes beyond how much you have — it’s how to keep it structured, validated and organized so your employees can access it and extract value from it, now and in the future.

There is more of it every day.

You may have also heard of the Internet of Things (IoT). It seems like every day, a new device is being called “smart” because it measures, reacts and reports on aspects of its operation. Smart thermostats? Yup. Internet-enabled refrigerators? Already here. This means that for all the data you have now, you should expect to have more. Lots more. And that’s a good thing… as long as you can keep up with it by moving your analytics closer to the edge.

Another big problem with big data is that you need to be able to process an ever-increasing flow at an ever-increasing rate – the more timely your response, the more value you can extract.

It takes many different forms.

What do you picture when you think of data? You probably start by picturing tables, spreadsheets, log files and databases. These are the data sources that we have historically been comfortable working with, but today we have vast amounts of data that don’t neatly present themselves as columns of numbers – things like images, videos, sensor streams, articles, websites and social media posts. How do you process this data?

The third problem with big data is that most of it is unstructured. They say a picture is worth a thousand words, so if you can harness the information in pictures and sounds and words, there’s even more value to be extracted.

And you need to be able to trust it.

Did the sensors measuring things account for the glob of glub floating past them? Did the data connection go offline for a few hours thanks to a power outage or Internet hacker? And how much do you trust that summary from your data partner? Skips and blips happen, so when you go to look at all of this data you’ve collected, you need to ask yourself — how much can you rely on it?

Perhaps the biggest problem with data is its reliability. There’s an old expression in data analysis – “garbage in, garbage out”. In order to extract the best possible value from your data, you need to make sure it’s as meaningful as possible.
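To make “garbage in, garbage out” concrete, here is a minimal sketch in plain Python of the kind of sanity-checking a data pipeline can perform: drop physically implausible readings and flag gaps where the connection was likely offline. The sensor data and thresholds are made up for illustration.

```python
from datetime import datetime, timedelta

def clean_readings(readings, low, high, max_gap=timedelta(hours=1)):
    """Drop out-of-range sensor values and flag gaps in the timeline.

    readings: list of (timestamp, value) tuples, sorted by timestamp.
    Returns (clean, gaps): clean holds plausible readings; gaps lists
    (previous_ts, next_ts) pairs with no data in between.
    """
    clean, gaps = [], []
    prev_ts = None
    for ts, value in readings:
        if prev_ts is not None and ts - prev_ts > max_gap:
            gaps.append((prev_ts, ts))   # connection was likely offline
        prev_ts = ts
        if low <= value <= high:         # physically plausible reading?
            clean.append((ts, value))
    return clean, gaps

readings = [
    (datetime(2017, 5, 1, 0, 0), 21.5),
    (datetime(2017, 5, 1, 1, 0), -999.0),  # sensor glitch
    (datetime(2017, 5, 1, 5, 0), 22.1),    # 4-hour outage before this one
]
clean, gaps = clean_readings(readings, low=-40.0, high=60.0)
```

Real pipelines layer many more checks on top of this (schema validation, duplicate detection, cross-source reconciliation), but the principle is the same: question the data before you trust it.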

Data scientists often refer to these issues as “the 4 V’s of Big Data” – Volume, Velocity, Variety and Veracity. But ultimately, what they are looking for is the fifth ‘V’ – Value. Looking for Value in data may sometimes feel like looking for a needle in a haystack. But in reality, it’s closer to looking for several different types of needles in a farm where hay is constantly being harvested and gathered into stacks, bales and rolls. Plus there may be some nails and old toothpicks mixed in.

Different people extract value in different ways. Everybody is interested in data, not so much in the raw numbers but in how it can inform their decisions. The trick is to make the right data available to the right people with the right tools in time to allow them to act. Let’s look at some of these people.

The Data Scientist

Data scientists are the data professionals – highly educated in the art of math, statistics, advanced analytics and machine learning. They are typically allocated budgets for long-term projects to solve large-scale, well-defined business challenges. Unfortunately, there are very few of them and they are not the business domain experts who actually make the front-line business decisions. Their tools need to be powerful enough to build any algorithm and execute it at scale in a production environment.

The Citizen Data Scientist

With the rise of easy-to-use Business Intelligence (BI) tools, a new type of user was born — the “Citizen Data Scientist”. These are the people with domain knowledge who can now slice, dice and visualize the data themselves. They have become “data” specialists and often take on this role officially in their company. Their tools need to be able to visualize and analyze data quickly without having to wait for data or reports to be prepared by programmers or data scientists.

The Business Domain Expert

The business domain expert knows far more about the business or science of their industry than how to manipulate data. Historically, spreadsheets have been the analytics tool for the masses. The complexity of the analytics and the sheer size of the data, however, have left them looking for new tools. Their new tools need to deliver deeper analysis on bigger datasets, while still being as intuitive and responsive as a spreadsheet.

Gartner predicts that the role of the Citizen Data Scientist will grow 5x faster than that of its highly trained counterpart, the Data Scientist.

Businesses have all of these types of users and they all need to be more data-driven. Ultimately, all of these people need to work together — to collaborate to find new insights and leverage each other’s strengths.  

We have identified the big issues with data and the requirements of the people who need to leverage it, but let’s go back to the fifth ‘V’ – Value. Value is the reason your data is so important, allowing the people at every level of your organization to make better, more informed decisions.  And with the advent of cloud computing, dynamic clusters of computing resources are now capable of processing data at any scale. Let’s look at some of the ways big data can be turned into value.

Data Visualization

A great place to start extracting value is by visualizing your data. Modern BI tools are responsive and let you explore wherever your questions lead you. Start with an overview of all of your data and drill down to any specific analysis. Query any information in your data and visualize it to gain insights.
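The overview-then-drill-down workflow described above is, at its core, aggregation at different levels of detail. Here is a minimal sketch in plain Python using hypothetical order records — an illustration of the idea, not any particular BI tool’s API:

```python
from collections import Counter

# Hypothetical order records: (region, product, revenue)
orders = [
    ("East", "widgets", 120.0),
    ("East", "gadgets",  80.0),
    ("West", "widgets", 200.0),
    ("West", "widgets",  50.0),
]

# Overview: total revenue per region
overview = Counter()
for region, product, revenue in orders:
    overview[region] += revenue

# Drill down: revenue per product within one region of interest
drill = Counter()
for region, product, revenue in orders:
    if region == "West":
        drill[product] += revenue
```

A BI tool does exactly this behind every chart click — re-aggregating the same records at a finer grain — which is why responsiveness at scale matters so much.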

Machine Learning

Machine learning gives you the ability to model the processes behind your data, providing the deepest insights. With the latest advances in machine learning, you can detect anomalies in your data, cluster it into similar groups, classify new data based on previous observations, simulate causes and effects, predict outcomes, create recommendations and even optimize your processes for the best performance.
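As one concrete instance of the anomaly detection mentioned above, here is a minimal sketch using the median-absolute-deviation rule — a standard robust statistical technique, not any particular product’s method. The temperature readings are made up for illustration:

```python
from statistics import median

def find_anomalies(values, threshold=3.5):
    """Flag values far from the median, measured in units of the median
    absolute deviation (MAD). Unlike a mean/stdev rule, a single large
    outlier cannot inflate the scale estimate and hide itself."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    return [v for v in values if abs(v - med) > threshold * mad]

temps = [21.0, 21.3, 20.8, 21.1, 21.2, 35.0]  # one suspicious spike
anomalies = find_anomalies(temps)
```

Clustering, classification and prediction follow the same pattern at larger scale: a model summarizes the “normal” structure of the data, and value comes from what agrees with it, what deviates from it, and what it forecasts.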

Data Analytics

Sometimes the insights are in the data, but simple data queries and visualizations are not able to uncover them. More advanced analytics like statistical, Fourier or cluster analysis may be required for you to find the value. Modern tools make it possible for advanced analytic visualizations to be just as responsive and easy-to-use as BI tools.
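To illustrate what Fourier analysis can uncover that a simple query cannot, here is a minimal discrete Fourier transform in plain Python, used to find the dominant frequency hidden in a synthetic signal. This is an illustration of the concept, not how a production analytics tool would implement it (those use the much faster FFT):

```python
import cmath
import math

def dft_magnitudes(samples):
    """Magnitude of each frequency bin of the discrete Fourier transform."""
    n = len(samples)
    return [
        abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)))
        for k in range(n)
    ]

# A signal sampled 32 times, oscillating 4 full cycles: its energy
# should concentrate in frequency bin 4 (and its mirror image, bin 28).
n = 32
signal = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]
mags = dft_magnitudes(signal)
dominant = max(range(1, n // 2), key=lambda k: mags[k])
```

Looking at the raw 32 numbers, or even plotting them, would show only a wiggle; the transform turns “wiggle” into the precise, queryable fact “4 cycles per window”.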


Big data visualization, analytics, modeling, decisions — all of these are interrelated and the path between them is rarely linear. New data and insights lead to new visualizations and models in a continuous process of greater discovery.  This means that every aspect of the value extraction process — from data acquisition and preparation through visualization and analytics to live application delivery — should be viewed as inseparable. This process is iterative and there are many participants. The environment where you collect, validate and clean your data should be where you visualize, experiment, explore and ultimately tell the story of your data through powerful data-driven applications.

Meet nD

nD was created as a solution to these big data problems. It takes unbelievable amounts of free-flowing data and lets you organize it, validate it, visualize it, learn from it and ultimately make better decisions based on it. It lets anyone, from the CEO to the people on the front lines, interact with the data that affects them, confirm that things are working as expected and try out new ideas as they happen. This is what makes the data valuable: the ability to collaborate, not just with people down the hall, but with people anywhere in the world you want to share data and insights with, all within the same platform.

Over the next few weeks, we’re going to share more blog posts that go behind the scenes of the technology powering the new platform, with articles focusing on:

  • Visualization & Advanced Analytics
  • Modeling & Machine Learning
  • Collaboration & Deployment in the Cloud
