How CIOs Can Avoid Six Big Data Fails
06/20/2017 by Jaime D’Agord | Modernization - Analytics
Budgets for analytics platforms are on the rise, and every company is looking to gain an advantage with its data. Forbes estimates that more than one-third of enterprises planned to increase their budgets for data-driven initiatives in 2015, while most of the rest expected to at least hold them steady. With all of the pressure to gain that advantage before competitors do, it becomes easy for things to be misunderstood and handled poorly, especially given how new analytics platforms are to the IT industry.
Here are six of the crucial failures that prevent organizations from having a successful analytic platform:
It’s easy to see a product in a sales pitch or demonstration and get excited about the possibilities. This is where key stakeholders need to take a good look throughout their organization and determine what is best for the company before buying in.
Thorough planning and analysis must be done by business and IT leaders to determine what tools best suit the company’s needs. It is true that Hadoop, Tableau, etc. have incredible capabilities, but you might not necessarily need those tools depending on what you are trying to do. A lot of data discovery can still be accomplished by using traditional tools and better data management.
You should also consider what resources are available to implement the solution as desired. If those skill sets can’t be found in-house, extra hiring or contracting will be necessary. When this happens, further analysis is needed to find a reliable firm to assist with your big data needs. At Zencos, we consulted with and supported Peak Health Solutions in building a SAS-based analytics application to track Medicare Plan performance.
Administrators, developers, and analysts are all going to be using the tools. As much as a sales team explains how easy it is to implement and use, there is always going to be a learning curve for a new technology.
Here it is imperative that IT and the respective business department be on board for a successful big data project. Directors and managers on the business and IT sides have to understand the time and effort it will take their employees to implement and acclimate to the new technology.
Getting the correct training and support for the new big data tools is crucial to success as well. No matter how clean and great a tool looks on the front end, it can still be a nightmare to develop behind the scenes.
Updating systems, replacing old ones, and bringing in outside data sources are all ways that data can arrive in a data warehouse. In addition, each system or source has someone handling its data in his or her own way. When all of these parts combine, they can lead to numerous problems if things aren’t mapped out correctly from the beginning.
Many projects fail because people don’t realize how much time is needed to understand, at the lowest level, what must be done with the data for it all to fit together. New sources are, more often than not, unstructured. From there, those sources have to go through a transformation process to fit into the data model. Depending on the current model and the number of new sources, this can be a time-consuming task. In another post from Zencos, we cover the important aspects of building an effective ETL process.
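To make the low-level work concrete, here is a minimal sketch of the kind of transformation step described above: mapping one semi-structured JSON record onto a fixed tabular model. The field names and target columns are hypothetical, purely for illustration.

```python
import json

# Hypothetical target model: flat records with fixed, typed columns.
TARGET_COLUMNS = ["patient_id", "visit_date", "charge_amount"]

def transform(raw_line):
    """Map one semi-structured JSON record onto the target model.

    The mapping choices here (renames, trimming, type coercion) are
    exactly the low-level decisions that consume project time.
    """
    record = json.loads(raw_line)
    return {
        "patient_id": str(record.get("pid", "")).strip(),
        "visit_date": record.get("visit", {}).get("date"),
        "charge_amount": float(record.get("charge", 0) or 0),
    }

# One raw record from a hypothetical source system:
raw = '{"pid": " 1001 ", "visit": {"date": "2017-06-20"}, "charge": "42.50"}'
row = transform(raw)
```

Multiply this by every field in every source and the time estimate in the paragraph above starts to look realistic.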
Many smaller organizations begin their analysis and reporting using tools like Microsoft Excel or PowerPoint. Over time the company grows but fails to upgrade the tools needed to move to the next level. BI companies have recently invested heavily in building new data discovery tools like SAS Visual Analytics. These tools support everything from standard reporting to advanced analytics, capabilities that can give an organization a deeper look at its data.
Several advantages come with these tools. For one, data is centralized, which allows for easier access. When you use Excel, data ends up sitting in silos among the various analysts and managers. Then when it comes time to combine those sources for a report, the data can be painful to merge since nothing ensured its consistency. Formats, freshness, abbreviations, and the like can all cause problems at this step. Another benefit of a centralized approach is the ability to run a batch process that keeps all of your data refreshed and consistent with its sources on a nightly or hourly basis.
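The merge pain is easy to illustrate. In this sketch, two hypothetical analyst extracts disagree on date format and region abbreviations, and someone has to discover and encode the reconciliation rules by hand; the column names and code values are made up for the example.

```python
from datetime import datetime

# Two hypothetical analyst extracts of the "same" data, kept in silos.
sheet_a = [{"region": "NE", "date": "06/20/2017", "sales": 100}]
sheet_b = [{"region": "Northeast", "date": "2017-06-20", "sales": 250}]

# Abbreviation fixes, discovered manually while reconciling the silos.
REGION_MAP = {"NE": "Northeast"}

def normalize(row, date_format):
    """Coerce one silo's conventions into a shared format."""
    return {
        "region": REGION_MAP.get(row["region"], row["region"]),
        "date": datetime.strptime(row["date"], date_format).date().isoformat(),
        "sales": row["sales"],
    }

# Each silo needs its own date format just to merge cleanly.
merged = [normalize(r, "%m/%d/%Y") for r in sheet_a] + \
         [normalize(r, "%Y-%m-%d") for r in sheet_b]
```

A centralized platform does this normalization once, in one place, instead of ad hoc in every analyst's spreadsheet.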
These new tools are also Web-based solutions offering better graphical capabilities than traditional MS Office products. For example, SAS Visual Analytics has built-in geographic objects that can give you a mapped view of any data point, as long as you have corresponding location information (lat/long or address) for each record. You can also create controls and interactions for each report that let users filter the data instantaneously.
By making the jump to a data discovery tool, organizations are able to access their reporting from anywhere, have current data, and use better visual representations of their company’s data all from one tool.
If you have the budget and resources, the logical assumption would be to implement a solution that covers every department and the enterprise as a whole, right? You can get all of your data sources together at once and then have everything at your disposal for analysis.
A better solution is to start small and grow. Ben Zenick, COO of Zencos, explains that employee training and time to market are aspects of an implementation that usually do not go smoothly when you start on a large scale. With a typical waterfall approach, employees will not see and test the product until late in the project, and it will then take them a while to get acclimated to the new technology. In addition, starting big requires more development time, which leaves a large gap before end users can give feedback. This causes problems: end users become less engaged, and knowledge transfer suffers.
By going with an agile methodology and starting on a small scale, you can get a solution up and running for your employees to begin using much more quickly than if you tried to build an enterprise-wide solution right away.
CIOs who implement solutions sometimes forget that someone will need to keep the system running securely as time goes on. Data breaches have become quite a problem: it is estimated that each data breach costs on average $5.4 million, or $188 per record stolen. As valuable as the data we all continue to collect is, remember that it is also added liability for a corporation if it were to be hacked. Like other content sharing and database tools, analytics systems need a layer of security and administration to keep access in order and prevent conflicts.
A system administrator who understands how to secure the data across the network is invaluable to large and small organizations alike. You do not want to grant your business end users access to the source data before it is transformed for their use. This creates more doors into your source data and could end up exposing data that should be private. Also, access to any Web-based reporting must be locked down; competitors could quickly learn a lot about your company if they gained access to the cleansed data used for reporting. Small oversights like these are easy to make, but can be very detrimental to a company if not handled appropriately.
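At its core, the administration described above means checking, at every access path, that a user's role is explicitly granted what it requests. Here is a minimal sketch of a role-based check that keeps raw source tables separate from cleansed, report-ready tables; the role names and table names are hypothetical.

```python
# Hypothetical grants: business analysts see only cleansed reporting
# tables, while ETL administrators can also reach raw source data.
ROLE_PERMISSIONS = {
    "analyst": {"reporting.sales_summary", "reporting.claims_summary"},
    "etl_admin": {"source.raw_claims", "reporting.sales_summary",
                  "reporting.claims_summary"},
}

def can_read(role, table):
    """Return True only if the role is explicitly granted the table.

    Unknown roles get an empty grant set, so the default is deny.
    """
    return table in ROLE_PERMISSIONS.get(role, set())

# A business user never reaches the untransformed source data:
can_read("analyst", "source.raw_claims")        # denied
can_read("analyst", "reporting.sales_summary")  # allowed
```

The important design choice is deny-by-default: anything not explicitly granted is refused, so a forgotten role or a new table fails closed rather than open.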
Zencos is experienced in all phases of an analytics program implementation. Whether it is up-front analysis, implementation, or administration, we can help. Contact us.
We help you set the right course, develop the right solution, and transfer the right knowledge. Put our experts to work on your next Analytics project.