- The demand for rapid decision-making is increasing and the complexity of data sources is growing; business users want access to new data sources, but in a way that is controlled and easily consumable.
- Organizations may understand the transformative potential of a big data initiative, but struggle to move from awareness of its importance to a concrete use case for a pilot project.
- The big data ecosystem is crowded and confusing, and a lack of understanding of that ecosystem can leave organizations paralyzed.
Our Advice
Critical Insight
- Big data is simply data. With technological advances, what was once considered big data is now more approachable for all organizations irrespective of size.
- The variety element is the key to unlocking big data value. Drill down into your specific use cases more effectively by focusing on what kind of data you should use.
- Big data is about deep analytics. Deep doesn’t mean difficult. Visualization of data, integrating new data, and understanding associations are ways to deepen your analytics.
Impact and Result
- Establish a foundational understanding of what big data entails and what the implications of its different elements are for your organization.
- Confirm your current maturity for taking on a big data initiative, and consider core data management practices in the context of incorporating big data.
- Avoid boiling the ocean: pinpoint use cases by industry and functional unit, then identify the most essential data sources and elements that will enable the initiative.
- Leverage a repeatable pilot project framework to build out a successful first initiative and implement future projects en route to a full big data program.
Member Testimonials
After each Info-Tech experience, we ask our members to quantify the real-time savings, monetary impact, and project improvements our research helped them achieve. See our top member experiences for this blueprint and what our clients have to say.
- Overall Impact: 7.0/10
- Average Days Saved: 3
| Client | Experience | Impact | $ Saved | Days Saved |
|---|---|---|---|---|
| Fond du Lac Band of Lake Superior Chippewa | Guided Implementation | 7/10 | N/A | 3 |
| RJRGLEANER Communications Group | Guided Implementation | 10/10 | $25,000 | 20 |
| The York Water Company | Guided Implementation | 9/10 | N/A | N/A |

RJRGLEANER Communications Group: "Team members were on time and provided useful insights and guidelines. No real bad experience; they have been very helpful."

The York Water Company: "It's hard to estimate time and money savings, but it was a valuable process. We always have multiple large projects running out of our department, ..."
Workshop: Leverage Big Data by Starting Small
Workshops offer an easy way to accelerate your project. If you are unable to do the project yourself, and a Guided Implementation isn't enough, we offer low-cost delivery of our project workshops. We take you through every phase of your project and ensure that you have a roadmap in place to complete your project successfully.
Module 1: Undergo Big Data Education
The Purpose
- Understand the basic elements of big data and its relationship to traditional business intelligence.
Key Benefits Achieved
- Common, foundational knowledge of what big data entails.
| Activities | Outputs |
|---|---|
| Determine which of the four Vs is most important to your organization (see the scoring sketch below). | Relative importance of the four Vs from IT and business perspectives |
| Explore new data through a social lens. | |
| Brainstorm new opportunities for enhancing current reporting assets with big data sources. | High-level ideas for improving report artifacts using new data sources |
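To make the relative-importance exercise concrete, here is a minimal sketch of averaging ratings from IT and business participants and ranking the Vs. The 1-5 scale and the rating values are hypothetical placeholders, not part of the blueprint.

```python
# Minimal sketch: rank the four Vs by averaging hypothetical 1-5 ratings
# gathered from IT and business participants during the workshop.
ratings = {
    "volume":   {"it": 3, "business": 2},
    "velocity": {"it": 4, "business": 3},
    "variety":  {"it": 5, "business": 5},
    "veracity": {"it": 3, "business": 4},
}

# Average the two perspectives, then sort from most to least important.
scores = {v: (r["it"] + r["business"]) / 2 for v, r in ratings.items()}
for v, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{v:<8} {score:.1f}")
```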
Module 2: Assess Your Big Data Readiness
The Purpose
- Establish an understanding of your current maturity for taking on big data, and revisit essential data management practices.
Key Benefits Achieved
- Concrete idea of current capabilities.
- Recommended actions for developing big data maturity.
| Activities | Outputs |
|---|---|
| Determine your organization’s current big data maturity level (see the scoring sketch below). | Established current-state maturity |
| Plan for big data management. | Foundational understanding of data management practices in the context of a big data initiative |
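As an illustration only, the sketch below rolls capability self-ratings up to a single maturity level. The dimensions, ratings, and five-level scale are assumptions for the example, not the blueprint's own assessment.

```python
# Minimal sketch: map hypothetical 1-5 capability self-ratings onto a
# five-level maturity scale. Dimensions, ratings, and level names are
# illustrative assumptions only.
capabilities = {
    "data_governance":  2,
    "data_integration": 3,
    "analytics_skills": 1,
    "infrastructure":   2,
}

levels = ["Ad hoc", "Developing", "Defined", "Managed", "Optimized"]
avg = sum(capabilities.values()) / len(capabilities)
level = levels[min(round(avg), len(levels)) - 1]  # 1-5 average -> level index
print(f"Average capability score: {avg:.1f} -> maturity level: {level}")
```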
Module 3: Pinpoint Your Killer Big Data Use Case
The Purpose
- Explore a broad range of potential use cases at the industry and business-unit levels, then use the variety element of big data to identify the highest-value initiative(s) within your organization.
Key Benefits Achieved
- In-depth characterization of a pilot big data initiative that is thoroughly informed by the business context.
| Activities | Outputs |
|---|---|
| Identify big data use cases at the industry and/or departmental levels. | Potential big data use cases |
| Conduct big data brainstorming sessions in collaboration with business stakeholders to refine use cases. | Potential initiatives rooted in the business context and identification of valuable data sources |
| Revisit the variety dimension framework to scope your big data initiative in further detail. | Identification of specific data sources and data elements |
| Create an organizational 4-column data flow model with your big data sources/elements. | |
| Evaluate data sources by considering business value and risk. | Characterization of data sources/elements by value and risk |
| Perform a value-effort assessment to prioritize your initiatives (see the sketch below). | Prioritization of big data use cases |
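To illustrate the value-effort assessment, here is a minimal sketch that ranks use cases by business value and breaks ties with effort. The use-case names and 1-5 scores are hypothetical; in practice they would come from the stakeholder scoring sessions above.

```python
# Minimal sketch: prioritize use cases by business value (descending),
# breaking ties with implementation effort (ascending). Names and 1-5
# scores are hypothetical placeholders.
use_cases = [
    {"name": "Customer churn prediction",  "value": 5, "effort": 3},
    {"name": "Sensor-driven maintenance",  "value": 4, "effort": 4},
    {"name": "Social sentiment reporting", "value": 4, "effort": 2},
]

ranked = sorted(use_cases, key=lambda u: (-u["value"], u["effort"]))
for rank, u in enumerate(ranked, start=1):
    print(f"{rank}. {u['name']} (value={u['value']}, effort={u['effort']})")
```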
Module 4: Structure a Big Data Proof-of-Concept Project
The Purpose
- Put together the core components of the pilot project and set the stage for enterprise-wide support.
Key Benefits Achieved
- A repeatable framework for implementing subsequent big data initiatives.
| Activities | Outputs |
|---|---|
| Construct a work breakdown structure for the pilot project. | Comprehensive list of tasks for implementing the pilot project |
| Determine your project’s need for a data scientist. | Decision on whether a data scientist is needed, and where data science capabilities will be sourced |
| Establish the staffing model for your pilot project. | RACI chart for the project |
| Perform a detailed cost/benefit analysis (see the sketch below). | Big data pilot cost/benefit summary |
| Make architectural considerations for supporting the big data initiative. | Customized, high-level architectural model that incorporates technologies that support big data |
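As one way to structure the cost/benefit summary, the sketch below nets one-time and recurring costs against benefits over a fixed horizon. Every figure is a placeholder assumption to be replaced with your own estimates.

```python
# Minimal sketch: net hypothetical pilot costs against benefits over a
# fixed horizon. All figures are placeholder estimates.
one_time_costs  = {"hardware": 40_000, "software": 25_000, "consulting": 15_000}
annual_costs    = {"support": 10_000, "staffing": 60_000}
annual_benefits = {"labor_savings": 55_000, "revenue_uplift": 45_000}

years = 3  # evaluation horizon for the pilot
total_cost    = sum(one_time_costs.values()) + years * sum(annual_costs.values())
total_benefit = years * sum(annual_benefits.values())
net = total_benefit - total_cost

print(f"Total cost over {years} years:    ${total_cost:,}")
print(f"Total benefit over {years} years: ${total_benefit:,}")
print(f"Net benefit: ${net:,} (simple ROI {net / total_cost:.0%})")
```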