Too Much Data, Too Little Time, Yet Too Important to Get Wrong: 5 Demonstrated Practices to Move More Quickly from Analysis to Actionable Insights
I work globally with executives to help them articulate strategic choices, develop robust plans, and take business-boosting actions by better meeting customers’ emerging needs or marketplace demands. Their companies rely upon the skillful analysis of business, competitive, customer, and market data and information to confidently propel their executives toward action. Nearly every organization I assist employs digitized information processes, solutions, and/or systems that help it compress the time executives need to receive and act upon the results of analysis. However, the information firehose and data flood often overwhelm the organization’s analysts, cause paralysis by analysis, slow decision-making, or add little to leaders’ confidence in terms of reducing complexity, uncertainty, risk, and/or doubt (aka CURD). Are there demonstrated organizational principles or practices that can keep this CURD tide from overwhelming us in our timely quests for better insights? In my experience and research, I’ve seen many beneficial practices effectively demonstrated, and I share five of the best ones here.
First, the best information systems and solutions save time and effort by helping analysts effectively and efficiently separate signal from noise. Noise comes from having too much, bad, false, misleading, and/or the wrong data. For organizations that gather data and information via primary, secondary, and social sources (and who doesn’t?), noise can reach overwhelming levels (e.g., 98–99%), drowning out the signal the organization needs to observe and understand. Learning quickly how solutions or systems can improve the signal-to-noise ratio is paramount to performing useful analysis. And from what I’ve observed, the answer isn’t just “turning down the flows” – though many organizations attempt this – but rather turning down the flows of noise while simultaneously amplifying the signal. Achieving this nearly always requires enhanced filtering, stronger taxonomies, and appropriate real-world testing.
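As a loose illustration of what “turning down the noise while amplifying the signal” can look like in practice, here is a minimal Python sketch of taxonomy-based filtering. The taxonomy topics, terms, and threshold are all hypothetical placeholders, not a prescription; the point is simply that a weighted taxonomy lets you score and rank items rather than wade through the whole feed.

```python
# Minimal sketch of taxonomy-based noise filtering (all terms hypothetical).
# Items that touch taxonomy topics are kept and scored; the rest are dropped as noise.

TAXONOMY = {  # hypothetical taxonomy: topic -> matching terms
    "competitor": {"rival", "competitor", "launch"},
    "customer": {"complaint", "churn", "feature request"},
}

def relevance(item: str) -> float:
    """Score an item by the fraction of taxonomy topics it touches."""
    text = item.lower()
    hits = sum(1 for terms in TAXONOMY.values()
               if any(t in text for t in terms))
    return hits / len(TAXONOMY)

def filter_signal(items, threshold=0.5):
    """Keep only items whose relevance meets the threshold."""
    return [(relevance(i), i) for i in items if relevance(i) >= threshold]

feed = [
    "Rival launch rumored; early customer complaints about churn",
    "Office picnic scheduled for Friday",          # noise, filtered out
    "Feature request backlog growing among key accounts",
]
for score, item in sorted(filter_signal(feed), reverse=True):
    print(f"{score:.1f}  {item}")
```

A real deployment would replace the keyword sets with a maintained taxonomy and proper relevance scoring, but even this crude version shows the two levers named above: the taxonomy amplifies signal, and the threshold turns down noise.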
Second, executives’ intense desire to quickly identify potential options or solutions often causes additional problems. The aim of compressing the analysis cycle is desirable, because windows of market opportunity may be short-lived. Nevertheless, any process that compresses decision time incurs trade-offs among solution quality, confidence and uncertainty levels, and cost. For example, results can be generated more quickly if executives devote more resources to generating options, accept lower confidence levels, or value fast action over deliberation. Most analysts and their executives fail to calculate the impacts of these trade-offs up front; they would benefit from negotiating and agreeing upon them before embarking on an analysis project, saving time over the project’s duration.
Third, compressing decision-making or action-taking time produces better results when the requesting executives know precisely what they want to use the results for and by when. Too many analytical exercises are akin to fishing expeditions, in which badly defined problems seek solutions, or badly developed solutions hunt for problems – both of which waste time. Effective systems that produce better deliverables nearly always begin by asking better, deeper, and/or more discriminating questions. Often those questions will make executives uncomfortable or fearful, but that doesn’t make them any less vital. The best analysts use their information systems to generate more discerning questions that push the analysis in directions that produce more useful or actionable results. Are your systems or solutions driven by smarter questions?
Fourth, most systems rely on the quality and timeliness of their data and informational inputs – everybody recognizes the “garbage in, garbage out” (GIGO) phenomenon. Nevertheless, most organizations I observe collect data as though it were an endangered species, capturing anything and everything on the assumption that some of it may prove important in time to help them better understand a key decision or planning target. This has led to a decade-plus of growth in data lakes, pools, and repositories bursting at the seams with raw data. What’s the problem with this? The biggest issue in my clients’ worlds is that raw data and information have a limited and decreasing shelf life, much like fresh produce or milk – if you don’t consume it quickly, it spoils! Much of the data collected today may be inaccurate, invalid, or incorrect tomorrow or shortly thereafter. If you don’t believe me, observe finicky customers who want a product one day and change their minds after learning online about new features or rival options. Many organizations fail to tag their data with timeliness or “good by” dates, which causes them to employ stale data in their calculations without recognizing that the analysis output is tainted.
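One lightweight way to act on the shelf-life point above is to tag each record with a “good by” date at collection time and filter expired records out before analysis. The sketch below, in Python, uses entirely hypothetical record types and shelf-life values; the mechanism, not the numbers, is the point.

```python
from datetime import date, timedelta

# Hypothetical record tagging: each record carries a collected-on date and a
# shelf life for its type, from which a "good by" date is derived at collection.

SHELF_LIFE_DAYS = {"customer_sentiment": 30, "pricing": 7, "demographics": 365}

def tag(record_type: str, collected: date) -> dict:
    """Attach a good-by date based on the record type's shelf life."""
    return {"type": record_type,
            "collected": collected,
            "good_by": collected + timedelta(days=SHELF_LIFE_DAYS[record_type])}

def fresh(records, today: date):
    """Drop records past their good-by date before they taint the analysis."""
    return [r for r in records if r["good_by"] >= today]

today = date(2024, 6, 1)
records = [
    tag("pricing", date(2024, 5, 20)),             # stale: good by 2024-05-27
    tag("customer_sentiment", date(2024, 5, 20)),  # fresh: good by 2024-06-19
]
print([r["type"] for r in fresh(records, today)])
```

In a real pipeline the shelf-life table would be debated and owned by the business, since deciding how quickly pricing data spoils versus demographic data is exactly the judgment call this practice forces into the open.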
Finally, many organizations are too quick to jump into shiny new, “next big thing,” or seemingly better information solutions, wanting to maximize their use before the organization’s employees are prepared to use them properly. It can take considerable time to make and execute important solution or system decisions. The more you will rely on them, the more time you ought to spend analyzing your options. And the more a system requires human intervention, the more time you should allocate to making sure human-system interactions work the way they are supposed to. Speed can be a valuable way to enter and win a market, but when it comes to making choices about information systems and solutions and how best to use them, the old saying that “speed kills” applies – understanding the variable impacts of time can save careers!