
Interview with Drew Rockwell, CEO of Lavastorm

1. Short Bio

I started my career in the communications industry, where I spent 20 years with a Tier 1 carrier in probably 15 different jobs across the entire organization: Marketing, Advertising, Product Management, Operations, Sales, General Management, Strategy and Business Development. I basically experienced a multi-billion business from many different functional areas, at increasingly responsible management levels. When I was in my early 40s, I decided to embark on a “second” career, to take what I had learned and to try to animate and build companies, which ultimately led me to MDS Lavastorm Analytics, where I am CEO today.

2. How and when did you become interested in analytics?

I think in all the various jobs I had, I was left a little cold by “reports”, which later in my career became visually more appealing dashboards, but which in the end seemed more like ways to describe a certain situation, a function, a customer segment, and so on. They were and are necessary, and sometimes they were cool to look at, but for me they were not sufficient.
Analytics to me are less focused on describing a situation and more focused on understanding “why” something is happening, so that I can do something about it, or on simulating or predicting what might happen, so I can plan for it.
I was always interested in connecting things, in understanding relationships between things, and I think that was what made me gravitate to analytics as a career.

3. Do you have any predictions for the coming year or two in the field of analytics?

The field of analytics is highly dynamic, with technology changes and increased investment. There is a great deal of change going on and a great deal of opportunity in front of us. We posed that exact question to the Lavastorm Analytics LinkedIn community, an online community we manage, and we got a tremendous list of predictions for the field. Personally, I see a few themes gaining more traction in the coming 24 months:

  • Analytic power will continue to become much more decentralized, moving from IT organizations to business users, moving from the exclusive domain of highly technical people to less technical users, moving from dependency on large data warehouses to a variety of data sources, tools, and methodologies to get to insight and action quickly. One data point: 40% of analytics budget spending will move to business departments in the next 3 years (Gartner).
  • Analytic methodologies are becoming more discovery-driven and less dependent on the crafting of a question or a query. With the proliferation of “big data” there will be a need for more agile ways to test hypotheses, to join disparate data, both structured and unstructured, and to more easily construct analytics. We have made huge strides in optimizing the processing of data, but in the next few years we will see huge strides in optimizing the analytic process itself, which I think will create a new wave of insight and action. The key here is to be able to gain analytic insight from within a business process itself, to add context to the data you are analysing.
  • At the same time as there is growth in the profession of analytics, and the continued emergence of data “scientists,” this specialized knowledge will create more powerful software assets to extend that knowledge to a much broader group of analytic “consumers” who will be focused on capturing value, on the answers not the methodology.

4. How do you see analytic models in the era of big data?

It seems to me there is a need for analytic models to become less and less “rigid” and more and more “adaptive”. For example, as you inspect data at a detailed level and wonder about new questions that you want to understand, the “cost” in terms of time necessary to pursue those new questions or models should be virtually free. This is something I think Lavastorm does very well. In addition, the nature of ‘audit analytics’ is changing, moving from single-source data requirements to multiple sources, with a much greater focus on auditing the business process itself. This will help finance departments turn audit from a cost centre into a money-making function.

5. How do we get data silos, internal and external sources, to blend together?

I view this as one of the key enablers of true analytics. In general, BI technologies have failed to make the bringing together of disparate data easy enough and they haven’t been able to create an analytic connective layer without having to put everything in a data warehouse. At Lavastorm, we have focused a lot of engineering talent on simplifying the joining of disparate data while maintaining the traceability of any data used in the analysis back to its original sources. This “traceability” builds confidence in the results and can be applied to a new generation of audit analytics.
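The idea of joining silos while keeping every row traceable to its source can be made concrete with a small sketch. This is a hypothetical illustration, assuming simple dictionary records and invented field names; it is not Lavastorm's actual implementation, which is proprietary.

```python
# Minimal sketch: inner-join records from two data silos while tagging
# every output row with lineage metadata, so each result can be traced
# back to the systems it came from. Field names are illustrative.

def join_with_lineage(left, right, key, left_name, right_name):
    """Inner-join two lists of dicts on `key`, recording provenance."""
    right_index = {row[key]: row for row in right}
    joined = []
    for row in left:
        match = right_index.get(row[key])
        if match is None:
            continue  # no counterpart in the other silo
        merged = {**row, **match}
        # Lineage metadata: which silos this joined row came from.
        merged["_sources"] = [left_name, right_name]
        joined.append(merged)
    return joined

crm = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Beta"}]
billing = [{"cust_id": 1, "balance": 120.0}]

result = join_with_lineage(crm, billing, "cust_id", "crm", "billing")
# result -> [{'cust_id': 1, 'name': 'Acme', 'balance': 120.0,
#             '_sources': ['crm', 'billing']}]
```

Carrying the `_sources` field through every downstream step is one simple way to preserve the kind of traceability described above: an auditor can always ask where a number in the final analysis originated.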

6. What do you think of real time analytics?

To my thinking, true insight that yields action trumps all. But I do think we will see more timely analytics, whether it is real time or near-real time will depend on the analytics and the value of timely action. The important point is that the analytics are reflecting current conditions. For example, the Lavastorm Analytics Platform gives organizations the ability to push the analytic closer to the source of data creation because the data doesn’t have to go through a data warehouse and, therefore, the data is closer to the business process itself. That allows for faster detection, faster reaction, and greater control.
I have been intrigued for years by using analytics to correct mistakes as they are happening. I think there are some interesting examples of this, but I expect there is much more to come.

7. MDS Lavastorm talks about “controls” in the context of analytics? Can you shed some light on that?

Yes, from our work in Fraud Analytics and Revenue Assurance, we have come to believe that there is value in running persistent analytics over business processes, to continuously identify data that do not conform to rules.
A simple example of this is order accuracy – we run analytics for companies that inspect an order against a number of highly conditional business rules or logic, checking to be sure that things like promotional codes are correct, discounts are correct, addresses match, etc. We basically call out errors very soon after they happen, allow the business to fix them before they cause downstream issues, and to understand the root cause of the error so that the business process can be fixed quickly. This is a good example of a control – once you have captured the correct business rules (what is supposed to happen) there is enormous value in continuously monitoring a process to be sure the rules are followed. Finding the percent that is wrong, correcting it, understanding why, and correcting that, has enormous value. A key principle that we built into the Lavastorm Analytics Platform is the ability to easily create business controls, store them in a library and reuse them.
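The order-accuracy control described above can be sketched as a small library of reusable rules applied to each order. The rule names, order fields, and thresholds below are invented for illustration; they are assumptions, not the Lavastorm Analytics Platform's actual API.

```python
# Hypothetical sketch of a reusable "control": a library of business
# rules run continuously over orders, flagging any that violate them.
# All names and thresholds here are illustrative assumptions.

VALID_PROMO_CODES = {"SPRING10", "LOYAL5"}

def promo_code_valid(order):
    # No promo code at all is fine; an unknown code is an error.
    return order.get("promo_code") in VALID_PROMO_CODES | {None}

def discount_within_policy(order):
    return 0 <= order.get("discount_pct", 0) <= 20

def addresses_match(order):
    return order.get("billing_zip") == order.get("shipping_zip")

# The rule library: capture the rules once, store them, reuse them.
RULES = {
    "promo_code_valid": promo_code_valid,
    "discount_within_policy": discount_within_policy,
    "addresses_match": addresses_match,
}

def check_order(order):
    """Return the names of all rules this order violates."""
    return [name for name, rule in RULES.items() if not rule(order)]

order = {"promo_code": "EXPIRED99", "discount_pct": 35,
         "billing_zip": "02110", "shipping_zip": "02110"}
violations = check_order(order)
# violations -> ['promo_code_valid', 'discount_within_policy']
```

Running `check_order` over every new order as it arrives is the "persistent analytics" idea in miniature: errors surface soon after they happen, and the violated rule names point directly at the root cause.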

8. How has big data changed the way we use analytics?

Well, it has obviously created the need for more and more powerful appliances and techniques for processing huge volumes of data, as well as for dealing with the complexity that this brings. It has also created the need to cost-effectively and quickly join together multiple sources of data and data types. And it is creating a greater need to do “discovery”-based analytics rather than pure query- or model-based analytics.


© 2017 AnalyticBridge.com is a subsidiary and dedicated channel of Data Science Central LLC
