Big Data is the new trend in analytics, which makes me wonder: does it really matter whether your data is big or little? The bigger question is, can you make informed decisions with the data you have? I often see companies with very little data, 10 million facts or fewer, struggling to meet the demands of business users who want and need answers to challenging questions about their business. Who are my top customers? Are we delivering as we promise? Do our vendors deliver as promised? How do we know if a campaign was effective? What are the leading indicators that identify potential financial challenges sooner rather than later? These are just a few of the many questions businesses face every day. As businesses start measuring and monitoring web impressions, social media feeds, and other transactions being generated by the millions every second of every day, the analysis challenge becomes even greater given the sheer size of the data being collected. I contend that no matter the size of the data, if you cannot easily access it and ask questions that return meaningful and accurate answers, then the data is useless, big or small.
Data has been accessible since the first computer was developed; it was just a matter of how, and of who did the reporting. Today, business users no longer want to wait for the really smart data wizard to perform some magic behind the keyboard and hand them a report, graph, or Excel export. They want to do that magic on their own. That, in my mind, is the biggest challenge. The business user is given limited access or canned reports because the data wizards (IT) feel the user is unable to ask and answer their own questions without IT's help. In some sense that is true, given the way transactional systems have been built using the most cryptic terminology and naming conventions. Who knew KUNNR meant Customer Number in SAP? To overcome this, many have resorted to data warehousing and other tools to build out a semantic layer for corporate data. This is a good start, but it takes a lot of time, money, and resources to build out, and quite often, by the time the warehouse is built, the rules have changed and new questions matter more than the old ones.
I think the next generation of data discovery tools represents a significant move in the right direction. I used to work for a guy who had a homespun answer whenever somebody asked how easy or how hard something was: “There ain’t no magic.” His point was that even with the new and emerging data discovery and big data tools, putting self-service BI into the hands of the typical business user still requires a well-thought-out strategy for data access, data governance, and data visualization. Self-service BI requires executive commitment, and achieving success requires teamwork, with Business and IT working together to make data, big or small, useful, timely, and accurate.
Wade Manis has 30 years of experience as a technology evangelist, working for companies including QBIC, Sverdrup, Computer Associates, and SCT (now Infor), plus nine years at QlikTech.
Currently Wade is a partner with Serviam, a solution provider that helps clients navigate the challenges of building a self-service BI program. Serviam knows data and has served a diverse mix of industries, including Retail, Apparel, Facility Management, Health Care, Manufacturing, and Financial Services. We focus on building solutions that are scalable, accurate, and meaningful. Serviam provides architecture, installation, development, application health check, and training services. We are experienced trainers who can deliver beyond the core curriculum by providing “real world” examples of how to build fundamentally sound BI solutions. Our mission is to serve you and to give organizations the ability to build “one version of the truth” with Business Analytics.
Image courtesy of Watcharakun / FreeDigitalPhotos.net.