Unlocking the true value of data
Choosing the right project delivery approach is key to big data project success.
Have you ever wondered why it seems so difficult for companies to successfully adopt big data technology? This applies to any new technology that a company wants to benefit from.
While there are a variety of things that can slow down technology adoption, there is one that stands out for me. Companies sometimes don’t know how to decide on an appropriate project management/implementation model to suit the new technology.
It’s possible for a company to have enough budget, access to skills, the correct drivers for the project, and yet, still fail at implementing a big data project.
Big data refers to large volumes of fast-moving unstructured and semi-structured data that can be mined to derive value. The value of big data is realised by engineering the data so that it is easily accessible and usable to internal business and external users, protected and secure, and, in some cases, by monetising it.
In this day and age, data is known as the ‘new oil’. With that in mind, failure to implement a big data strategy or big data “analytics” project prevents a company from unlocking the true value of their data.
As data specialists, we often hear the terms “I want big data”, “I want machine learning” or “I want artificial intelligence” from our clients. These statements are usually very disconnected from the actual business need, objective or want.
We often need to carefully unpack the requirements without discouraging or derailing the client from their vision. It’s imperative that the vision aligns with the company strategy; otherwise, the decision-makers will never support the new tools or technology initiatives that one is trying to implement.
The decision-maker’s buy-in enables access to budget, infrastructure and the appropriate skills. The rest is left to the organisation to ensure the correct drivers are tasked with the responsibility to execute these projects to completion, successfully.
Some of the known reasons for failure of big data projects include a lack of clarity on the solution and the lengthy turnaround time to realise proof of value or return on investment (ROI). Both are compounded by lengthy delivery timelines, which create scepticism. As such, choosing the correct delivery approach is a big contributor to the success or failure of a project.
An agile approach can work
The Scaled Agile Framework for Lean Enterprises (SAFe) is well suited to big data projects and to development on new technologies. SAFe is a “way of working”: a framework that allows incremental, continuous, fast, quality delivery of solutions.
In big data projects, there are a wide range of tools that can be used to do the same task. Choosing the right tool for the job is important and SAFe enables this via one of its nine principles.
The third principle states: “Assume variability; preserve options” – meaning teams don’t have to select and stick to a single design as indicated in traditional design and development lifecycle processes. Teams can maintain multiple design options for an extended period during the development lifecycle.
Using SAFe, the combination of incremental delivery and ceremonies enables teams to quickly demonstrate a working sub-product to the stakeholders. When implementing new technologies, changes in direction are inevitable.
Teams need to plan, execute and fail fast. Keeping stakeholders engaged, at all times, is an integral part of the framework. The constant engagement allows product owners and stakeholders to have clarity of the product that needs to be delivered and removes uncertainty.
Risks, assumptions, impediments and dependencies (RAID) are identified and presented during programme increment (PI) planning. The importance of RAID items and consistent communication cannot be stressed enough.
By way of example
I was once involved in an analytics project that took more than 18 months to be delivered due to licensing implications. The project approach guarded against “scope creep” (a project manager’s worst fear) at all costs, allowing no flexibility and making no provision for iterative deliverables, ultimately resulting in lost credibility and stakeholder trust.
As this example shows, it is important to be open-minded to radical changes in modern technology-related projects and to use models that take various aspects into account, allowing the scope to change and mould as needed.
In SAFe, a risk can be resolved by the team in which it inherently lies; owned by a team member while the problem remains unresolved; accepted by the team or product owner, or even at programme level when it cannot be resolved; or mitigated.
In the earlier years of my career, having a bedded-down, signed solution requirement specification was a prerequisite to starting a project. This led to much back and forth with clients who were unfamiliar with the technology that was planned for implementation. I recall creating over 15 system requirement specifications in the space of 12 months, of which only three were accepted.
All three were three- to six-month projects, scheduled to start the following year. It took 12 months to do “pre-planning”, probably another month for actual planning and at least another three months before the business realised the value of the solution. This is an extreme example as I believe most organisations have improved in their project approaches.
Companies that embrace new tools and technologies do so to gain a competitive advantage, optimise business processes, reduce operational costs and/or increase revenue. Taking extended timeframes to deliver value is not only detrimental to the appetite of the stakeholders, but also deprives the company of quick turnaround times to meet strategic, tactical and/or operational goals.
In my opinion, these projects failed before they even started. That is why choosing the correct project/product delivery framework is of paramount importance to the success of implementing new technology projects.
The Scaled Agile Framework is a definite candidate for incremental execution and continuous deployment that results in quicker proof of value, especially where new technologies are involved.
However, it is important to note that the appropriate approach (plan-driven, agile or hybrid agile) will depend on various factors related to organisational circumstances, the type of initiative, and importantly, the maturity of the team.
Windsor Gumede is principal consultant at PBT Group.
He is a self-motivated, results-driven principal BI consultant with 10 years’ experience in data and analytics. Gumede has worked on numerous data and analytics projects in Africa and the Middle East.
Throughout his career, he has played different roles, from ETL/ELT development, to data modelling, front-end development, solution architecture and design, to pre-sales consulting. The majority of his experience comes from the telecommunications industry, but he is currently maturing his knowledge in the insurance space, using big data technologies to help insurance clients comply with regulatory requirements.

Gumede is a strong believer in the core fundamentals of enterprise data management. “I see a huge gap in South Africa with technical resources that have skills in the big data engineering field but don’t have the proper grounding on enterprise data management principles. Skills on tools and technology without the literature is ineffectual.”