Written by Zdenek Svoboda
Business intelligence software has been a tool for strategic decision making for years, but until recently its use was fairly limited. BI struggled to scale to growing volumes of data and numbers of users, and where scaling was possible at all, supporting vast data volumes, thousands of users, and fast innovation cycles was extremely costly. These constraints made it difficult for BI tools to keep up with new business requirements as they emerged.
Let’s fast forward to today. The cloud has made scaling in all aspects significantly cheaper than it was before, and it’s now much more feasible as well. This has opened the door to entirely new opportunities for analytics, and it’s redefining how business is done.
New opportunities for analytics
Through the unique technical capabilities of the cloud, a number of new opportunities have cropped up for analytics. First, it’s now much easier to scale strategic decisioning to a large audience that spans regional and organizational boundaries, so you’re able to deliver analytics to everyone who can benefit from it.
Second, the cloud has enabled the deployment of extremely robust in-context, actionable analytics that drive business processes. Recommendations and automated decisions are embedded in the applications that business users use every day, so they can see in real time what the suggested next steps are and continuously improve business performance.
These and other opportunities build on some of the use cases outlined in Gartner’s recent Magic Quadrant report, like embedded analytics—which supports a workflow from data to embedded BI content in a process or application—and extranet deployment, which focuses on providing analytics to external businesses, customers, suppliers, distributors, or other business partners.
If you’re interested in taking advantage of some of these opportunities, take a look at the report I mentioned above, but also ask yourself whether you can pursue them with your existing analytic tools. Can you afford to wait for the Tableaus, Qliks, and Power BIs of the world to fully migrate their desktop analytics offerings to the cloud? I’m willing to bet that the answer is “no.” If you want to get ahead, stay ahead, and capitalize on the opportunities that the cloud has presented, then you need to shift gears.
Requirements for migrating to the cloud
If you’re thinking about taking your company to the next level in terms of its analytics and cloud capabilities, then you’ll need to build or purchase a particularly robust solution to do so. As you’re evaluating your options, keep these requirements for your application in mind.
1. Able to handle large data volumes
No longer just for a handful of internal managers, your new solution is for all of your customers, whether you have hundreds, thousands, or hundreds of thousands of them. Those thousands of customers each bring their own data. There’s just no way that Tableau can handle the data volumes necessary for all of them. Instead, you need an application built specifically to handle ever-increasing volumes of users and data.
2. Large concurrency and low latency
In your existing BI, you probably rarely see more than a handful of people using analytics at the same time. Extranet and embedded use cases are different: hundreds of business users will be running queries concurrently. Can your data warehouse sustain all of their queries? You’ll need a powerful solution to serve this many simultaneous users with consistently low latency.
3. Business-critical availability
BI was always somewhat critical, and outages for a day or two were annoying but not the end of the world. Embedded analytics, however, is truly business critical. If your application fails or stops responding, your business processes break down immediately as users lose access to the insights they’ve come to rely on. To avoid this risk, you need to operate your analytics under the same strict SLAs as your business applications.
4. Data security
If you’re scaling to thousands or hundreds of thousands of different users, then you also need to protect all those users’ data. Can you put all the data in a single database and rely on fragile SQL filters, or do you need to invest more in data security? Can you put all data in a single Redshift database? I think this is too risky. Consider storing each customer’s data in a separate database and leveraging all the standard data access and user management features to prevent a data breach and unauthorized access.
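To make the contrast concrete, here is a minimal sketch of the per-tenant-database approach. The `TenantRouter` class and the use of in-memory SQLite are my own illustrative assumptions; in production each tenant would get a separate database or schema with its own credentials. The point is that isolation comes from the connection itself, not from a `WHERE tenant_id = ?` filter that one missed clause can break.

```python
# Illustrative sketch (not a specific product's API): route each tenant
# to its own isolated database instead of filtering a shared table.
import sqlite3

class TenantRouter:
    """Maps each tenant to a dedicated database connection."""
    def __init__(self):
        self._dbs = {}

    def db_for(self, tenant_id):
        # One isolated database per tenant. SQLite in-memory stands in
        # here for a separate per-tenant database with its own credentials.
        if tenant_id not in self._dbs:
            conn = sqlite3.connect(":memory:")
            conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
            self._dbs[tenant_id] = conn
        return self._dbs[tenant_id]

router = TenantRouter()
router.db_for("acme").execute("INSERT INTO orders VALUES (1, 99.0)")
router.db_for("globex").execute("INSERT INTO orders VALUES (1, 10.0)")

# A query for one tenant cannot see another tenant's rows, even though
# no query carries any tenant filter at all.
rows = router.db_for("globex").execute("SELECT amount FROM orders").fetchall()
# rows == [(10.0,)]
```

A missing filter in the shared-database design leaks every tenant's data; a missing filter here returns only that tenant's data, because the blast radius is bounded by the connection.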
5. Multi-tenancy
With hundreds or thousands of different customers, an immense amount of time and resources would be required to build out independent experiences for all of them. Instead, you’ll want to build and roll out one multi-tenant solution for all of them and treat it as a product that you keep enhancing in agile iterations. This is a near-impossible goal for Tableau, Qlik, or PowerBI, which don’t provide management tools tailored for such large-scale rollouts.
6. Business user usability
With large-scale deployments, you don’t have the luxury of hiring a BI analyst, because you’d need hundreds or thousands of them. Even if you were to hire all those analysts, you still wouldn’t be able to keep up and deliver all the custom reports that your users across all organizations need.
Scaling to this many users requires a strong semantic model that drives consistency, and reusable and context-aware metrics that users just drag and drop. The governance tools that were designed for the limited audiences of desktop analytics—departments and small strategic teams—can’t deliver in large-scale deployments.
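The idea of a shared semantic model can be sketched in a few lines. The function names and the toy data below are my own assumptions, not any vendor's API; the point is that a metric like "revenue" is defined exactly once and then reused in whatever context (dimension) a business user drags it into, so every report computes it the same way.

```python
# Illustrative sketch of a reusable, context-aware metric: one shared
# definition of "revenue", sliced consistently by any dimension.
from collections import defaultdict

sales = [
    {"region": "EU", "product": "A", "amount": 100},
    {"region": "EU", "product": "B", "amount": 50},
    {"region": "US", "product": "A", "amount": 70},
]

def metric_revenue(rows):
    # The single, governed definition of the metric.
    return sum(r["amount"] for r in rows)

def slice_by(rows, dimension, metric):
    # The same metric applied in any context the user chooses.
    groups = defaultdict(list)
    for r in rows:
        groups[r[dimension]].append(r)
    return {key: metric(group) for key, group in groups.items()}

by_region = slice_by(sales, "region", metric_revenue)    # {'EU': 150, 'US': 70}
by_product = slice_by(sales, "product", metric_revenue)  # {'A': 170, 'B': 50}
```

Because every report calls the same `metric_revenue`, changing the business definition in one place updates every context at once — the consistency that per-report formulas in desktop tools cannot guarantee.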
Are you finding yourself in need of changing how you deliver analytics? Are you tasked with implementing these new use cases? Do you need to roll this out in the next six months? Check out our platform page or whitepaper for more information.
Need more detail on how to implement these new use cases? Visit our developer portal for more. If you’d like to have a conversation about your challenges, get in touch with a member of our team. We’ll work with you to answer your questions and find a solution that works for you.