A Look Under the Hood at Abt’s Analytic Strategy: Part 1: The Challenge and the Vision
Several years ago, Abt Global faced a problem common to organizations that grow and change organically. Over the previous decade, we had acquired two companies, each with its own analytic data and tools. As a result, project data was stored in three different places, and our analytic tools were fragmented: we had multiple copies of some tools, while others were available to some staff members but not others. Moreover, within each company, some people used software on their local desktops, while other projects worked on one of several on-premise servers, distinguished from standard shared drives by the level of data sensitivity they could hold. Some projects used a new cloud-based environment, primarily for highly sensitive data.
Imagine the chaos when different parts of the company needed to collaborate. Multiple copies of the same data proliferated. We would collect survey data in one environment and analyze it in another. ArcGIS was available in only one place, SAS in two others, and R could be used only on a desktop, so it could not access sensitive data.
Our analytic platforms posed other challenges as well. We focused on delivering our existing government contracts using traditional analytic tools, primarily SAS, SPSS, Stata and NVivo, and had only minimal capacity to use modern, open-source data science tools such as R and Python. These are the tools our new hires had been trained on, and we needed them to help new staff get their work done, to attract and retain talent, and to offer more modern analytic services to our clients. Similarly, we were well equipped to deliver traditional analytic reports, but sharing data visualizations was more of a challenge.
We clearly needed a comprehensive analytic strategy, so a group of senior analytic staff came together and developed one. The goals:
- Address the inefficiencies in our current analytic processes.
- Position ourselves for the data science work of the future.
- Improve our security posture.
- Reduce cost.
We needed to move from legacy, duplicative, on-premise, less secure and siloed platforms to a single, modern, secure and integrated platform. We developed a three-year analytic strategy, which not only encompassed our platform needs, but also looked at processes and people.
Conceptually, we divided our platform needs into two distinct categories.
- The Factory is the platform for our traditional research and analytic work. The Factory must contain all of our tools and project data and be accessible to all users. It must allow for repeatable processes and be secure and compliant enough to house our clients’ most sensitive data. And it must support the entire “conveyor belt” of the project lifecycle, from data collection through analysis and dissemination. This represented approximately 90% of our needs.
- The Laboratory supports experimental work. Here we can build, deploy and test new tools and run ad hoc data science experiments that may or may not produce usable results. We needed to wall it off from the Factory so that large experimental analyses would not degrade performance for everyone else. This was about 10% of our analytic work at the time, but a growing share of our expected future.
Beyond dividing our needs into the Factory and the Laboratory, another key decision was to consolidate all project analytic work onto one platform that met the highest common denominator of security and compliance, even if a particular project did not require such tight controls. Not only is this the right thing to do for our clients and the people whose data we are entrusted with, but it enabled us to deploy a single set of tools instead of duplicating them across multiple environments with different security controls. It also protects all data equally, regardless of client requirements. So the approach offers cost savings in addition to improved security.
After envisioning the platform of the future, the next step was to design the solution and plan the implementation. We will discuss these parts of the strategy in the next post.