4 Ways AI Analytics Projects Fail - How to Succeed
“How do I de-risk my AI-driven analytics projects?” This is a common question for organizations ready to modernize their analytics portfolio. Here are four ways AI analytics projects fail—and how you can ensure success.
Artificial intelligence (AI) offers tremendous benefits to businesses modernizing their analytics tools. Many enterprises already gain valuable insight from analytics in some form, with traditional business intelligence, automated reporting, dashboards and more. Yet decision-makers may find themselves in uncharted territory when considering AI deep learning or machine learning capabilities. How relevant is AI to them? How best to proceed?
Even the most accomplished and experienced IT leaders have worked on a transformational project that failed at some point in their careers, and some understandably view advanced analytics projects with skepticism. The challenges of integrating data from diverse silos are well documented. I’ve identified four common pitfalls that can derail a project—and four corresponding approaches that help organizations avoid trouble and realize a successful project.
For many decision-makers, the thinking goes that risk can be reduced by piloting something early: if the pilot fails, at least the failure comes before a significant investment of money or time. That instinct is grounded in hard-won experience with other approaches that exposed their limitations. Even so, organizations that adopt one or more of the following approaches can push the risk of failure past an acceptable level:
1. Data first: Starting by collecting all the available data and then determining how to use it can be tempting. The problem is that any organization may have petabytes of data, only a fraction of which has true value to the business. Placing it all in a large data lake, for example, may not lead directly to failure, but it can consume a great deal of effort without any assurance of a positive outcome.
2. Candy store: With so many AI opportunities available, pursuing several at once is tempting. Unfortunately, spreading the work across many initiatives dramatically increases the risk of failure: it dilutes effort and can add implementation complexity.
3. Clarity later: Building an AI capability first, with the intent of clarifying the question or objective later, invites avoidable risk. If it isn't clear how the insight from the analytics project is expected to deliver business value, hitting the target becomes all the more challenging.
4. Payoff later: Investing substantial time and money in an initiative, only to find that the question went unanswered or the gains never materialized, courts failure. You may lose the support of key stakeholders who needed a quicker return, and getting sign-off for future projects can become an uphill battle.
4 analytics methods that help AI projects avoid failure
How can organizations de-risk their transformational AI analytics projects and help ensure successful outcomes? Four strategies can help them avoid the common pitfalls outlined above while gaining the AI insight needed to boost business value:
1. Get clarity on the business question. Bring together a team of stakeholders that possesses business and analytical skills and apply critical-thinking techniques to question your suppositions. What does success look like? Where does the data come from, and which decisions might it support? How does the organization integrate this new insight into operational processes? The team also needs to determine what form the analytics data should take so that business users can consume it.
2. Enable faster exploration. Use a data science toolkit to assemble an ad hoc workflow tailored to the specific problem identified. Use IBM PowerAI™ and Python tools for data ingest, visualization, math libraries and so on to quickly assemble this workflow and data flow, enabling fast, early development of an analytics pipeline (see the first sketch after this list).
3. Empower a quicker win. Use a data science sandbox to create a prototype and scale from notebook to small cluster to justify a project, rather than using a PowerPoint graphic. If an idea is successful, the prototype provides an easy way to demonstrate benefits and fast scaling to a production cluster or cloud (see the second sketch after this list). Leaders can gain confidence in the idea early, rather than waiting for the fully developed analytics capability to be demonstrated.
4. Scale to production. An AI grid can be an effective method to scale your prototype into production. The AI grid is a scalable multitenant cluster with high stability and high-efficiency scheduling. A DevOps team can use the grid to turn the data scientist’s prototype into a hardened piece of software for training data models. The grid also provides a place where models can be updated based on feedback and maintained for operating the business.
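To make item 2 concrete, here is a minimal sketch of what an ad hoc exploration workflow might look like using common open-source Python tools (pandas, Matplotlib and scikit-learn). The file name and column names are hypothetical placeholders, and the specific toolkit in your environment, such as IBM PowerAI, may package these capabilities differently.

```python
# Illustrative exploration workflow: ingest, visualize, and fit a quick baseline.
# "sales.csv" and the "units_sold" column are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Ingest: load the data and keep numeric columns to keep the baseline simple.
df = pd.read_csv("sales.csv").select_dtypes("number").dropna()

# Visualize: a quick look at the target before any modeling.
df["units_sold"].hist(bins=30)
plt.title("Distribution of units sold")
plt.savefig("units_sold_hist.png")

# Model: a first-pass baseline to test whether the data can answer the question.
X = df.drop(columns=["units_sold"])
y = df["units_sold"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)
model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Baseline MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

The point is not the particular model; it is that a few lines of glue code can show whether the data can plausibly answer the business question before any larger investment is made.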
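For item 3, one way to keep the path from notebook to small cluster short is to write the prototype against a parallel dataframe library. The sketch below uses Dask purely as an illustration (it is not specific to the IBM stack), and the file pattern and scheduler address are placeholders.

```python
# Illustrative only: the same prototype logic running locally or on a small cluster.
# The scheduler address and file pattern below are placeholders.
import dask.dataframe as dd
from dask.distributed import Client

# Local mode for the notebook prototype; point at a cluster scheduler
# (for example, Client("tcp://scheduler:8786")) when scaling up.
client = Client()

# Ingest and aggregate: this code is unchanged between laptop and cluster.
df = dd.read_csv("sales-*.csv")
summary = df.groupby("region")["units_sold"].mean().compute()
print(summary)
```

Because the same code runs in both modes, the quick win demonstrated in the notebook becomes the starting point for the scaled-out version rather than a throwaway.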
IBM, together with IBM Business Partners, offers the tools and capabilities to help organizations implement these methods. Using AI analytics to gain new insight and innovate more rapidly enables organizations to be the disruptors in their field of business, rather than waiting to be disrupted. To learn more about AI and transformational analytics projects, visit IBM AI Infrastructure and deep learning applications.