From click tracking to actually understanding the conversation, how AI will finally deliver on the promise of BI
Business Intelligence (B.I.) is the infrastructure a business uses to collect, organize and manage all the data related to it. Everything from simple spreadsheets to more thorough dashboards falls under the umbrella of B.I. While B.I. has been fundamental to operations and business-strategy decisions, it has traditionally been used only to present data in a more readable way, an approach that stops short of providing actionable insights. As Michael F. Gorman, professor of operations management and decision science at the University of Dayton in Ohio, said in an article published by CIO Magazine, “[Business Intelligence] doesn’t tell you what to do; it tells you what was and what is.”
As this space evolves rapidly with the adoption of AI, we look at some of the ways in which Artificial Intelligence is turning simple business intelligence tools into something more powerful, and at the avenues opening up in this field.
Where is the B.I. train headed?
Among the many changes that have come with the adoption of machine intelligence, one important shift is the expanding range of data sources. Businesses already draw on newer technologies like IoT, click tracking and RPA systems that provide waterfalls of useful intel, and considerable effort is being spent discovering further data sources.
With the increase in the number of data sources, the next stop on the data train is at the junction of big data technologies. The need for advanced AI to analyze such volumes of data has been voiced continuously, and efforts are being made to provide it. Apache Hadoop and Spark are among the most popular open-source frameworks for storing and processing Big Data in a distributed environment.
Among AI’s contributions to upgrading business intelligence, the use case with the clearest ROI and broadest adoption that we have seen is predictive business analysis: large volumes of historical data are analyzed to predict future outcomes. A familiar example is the implementation of Next Best Action in telco call centers.
One small but impactful real-world use case: a call center agent using the historical record of past appointments to reach out to a customer and book the next one.
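To make the appointment example concrete, here is a minimal sketch of a Next Best Action heuristic. The data fields, scoring formula and customer records are all illustrative assumptions, not a production model; a real deployment would learn such scores from historical data.

```python
from dataclasses import dataclass

# Hypothetical appointment-history record; field names are illustrative.
@dataclass
class Customer:
    name: str
    days_since_last_visit: int
    avg_interval_days: int   # typical gap between this customer's appointments
    no_show_rate: float      # fraction of past appointments missed

def next_best_action_score(c: Customer) -> float:
    """Score how worthwhile it is to call this customer now.

    Heuristic: customers who are overdue relative to their usual booking
    rhythm score high; frequent no-shows are discounted.
    """
    overdue_ratio = c.days_since_last_visit / max(c.avg_interval_days, 1)
    return overdue_ratio * (1.0 - c.no_show_rate)

def rank_customers(customers):
    """Return customers sorted by score, best call candidates first."""
    return sorted(customers, key=next_best_action_score, reverse=True)

customers = [
    Customer("Asha", days_since_last_visit=95, avg_interval_days=30, no_show_rate=0.1),
    Customer("Ben",  days_since_last_visit=20, avg_interval_days=30, no_show_rate=0.0),
    Customer("Cara", days_since_last_visit=90, avg_interval_days=30, no_show_rate=0.8),
]
for c in rank_customers(customers):
    print(f"{c.name}: {next_best_action_score(c):.2f}")
```

The agent’s queue is simply the ranked list; the same idea scales to a learned propensity model once enough outcome data is collected.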
As we continue to explore real-time B.I., conversation analysis systems will become a crucial part of it, so that predictions based on historical data can be complemented with what customers are actually saying in real time: the topics raised, the issues faced, and so on.
Conversational Intelligence – the third eye of B.I.
Sophisticated conversational analysis platforms can contribute immensely as data sources. Speech-to-Text platforms have existed for a while and have in fact started to become commoditized, but extracting the nuggets of information and insight from conversations changes what real-time B.I. tools can deliver to organizations: B.I. that goes beyond “what is” and “what was” and becomes indicative of “what can be done.” This is where B.I. becomes action-oriented.
Prophecies for the unleashed
The influx of data seen by businesses motivates proactive, real-time systems for reporting and analysis. The immediate alerts and updates such systems provide can be especially beneficial across a range of industries.
Getting the meta-conversation data source: Conversations are the last mile of data; today they are lost or heavily underutilized. A system of record for meta-level information on customer conversations can give B.I. systems the horsepower for real-time, actionable intelligence. One example: capturing customer sentiment on the specific topics in calls that lead to appointments not getting booked, and feeding that back into the customer support organization to influence live call center conversations.
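The meta-conversation idea can be sketched as aggregating per-topic sentiment across call transcripts. The topic keywords and sentiment lexicon below are toy placeholders (a real system would use trained NLP models), but the aggregation shape is the point: raw conversations in, topic-level statistics out.

```python
# Toy aggregation of per-topic sentiment from call-transcript snippets.
# Keyword lists are illustrative assumptions, not a real lexicon or topic model.
POSITIVE = {"great", "happy", "thanks", "resolved"}
NEGATIVE = {"frustrated", "cancel", "waiting", "confusing"}
TOPICS = {
    "appointments": {"appointment", "booking", "reschedule"},
    "billing": {"invoice", "charge", "refund"},
}

def analyze(transcripts):
    """Return {topic: net sentiment} aggregated over all transcripts."""
    scores = {topic: 0 for topic in TOPICS}
    for text in transcripts:
        words = {w.strip(".,!?") for w in text.lower().split()}
        sentiment = len(words & POSITIVE) - len(words & NEGATIVE)
        for topic, keywords in TOPICS.items():
            if words & keywords:  # transcript mentions this topic
                scores[topic] += sentiment
    return scores

calls = [
    "I am frustrated with the booking page, kept waiting to reschedule",
    "thanks, the refund charge issue was resolved, great support",
]
print(analyze(calls))  # e.g. {'appointments': -2, 'billing': 3}
```

A negative score on “appointments” is exactly the kind of meta-level signal that could flow into a real-time B.I. dashboard or alert.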
For supervised learning approaches, it is important to highlight that the quality of an AI model depends on the quality of its data, which is why all available data should be exploited. Internal sources such as machines and software, meetings, and client interactions, as well as external sources such as website visitors, chatbots, feedback and review forums, competitive conditions, and demand trends reflected on social platforms, are a few of the gold mines of data.
Consider the infamous 80-20 rule even in data aggregation for B.I.: it is an axiom of business management that “80% of sales come from 20% of clients.” With tons of data, it is important to identify which part of it matters most. Whether that identification is done through AI or by domain experts within the company will come down to the cost and reliability of each, but either way, it is definitely something to keep in mind.
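The 80-20 identification itself is easy to express as code: find the smallest set of clients whose combined revenue crosses 80% of the total. The client names and figures below are made up for illustration.

```python
def pareto_clients(revenue_by_client, threshold=0.8):
    """Return the smallest set of clients (by descending revenue)
    whose combined revenue reaches `threshold` of the total."""
    total = sum(revenue_by_client.values())
    top, running = [], 0.0
    for client, revenue in sorted(revenue_by_client.items(),
                                  key=lambda kv: kv[1], reverse=True):
        top.append(client)
        running += revenue
        if running >= threshold * total:
            break
    return top

# Hypothetical revenue figures per client.
sales = {"Acme": 500, "Beta": 300, "Cyan": 120, "Dune": 50, "Echo": 30}
print(pareto_clients(sales))  # ['Acme', 'Beta'] — 2 of 5 clients cover 80%
```

The same cut can be applied to data sources themselves: rank them by how much they move the metrics you care about, and focus collection effort on the head of that distribution.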
Beyond visual representation and the role of NLG in B.I.: Developing branches of AI such as Natural Language Generation (NLG) are having a major impact on the usefulness and accessibility of the knowledge in B.I. tools. Jon Arnold, Principal of J Arnold and Associates, writes in his article for Enterprise Management 360: “A key reason why BI is a strong use case is that these platforms provide visual representations of the data, but this isn’t always helpful for workers. Not all forms of data can be easily visualized, and visual outputs aren’t always enough. Sometimes a written analysis is what’s needed, and other times voice is the format that works best.”
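At its simplest, NLG for B.I. means turning a metric comparison into a sentence instead of a chart. The sketch below is a deliberately minimal template-based narrator (production NLG platforms are far richer); the metric name and numbers are illustrative.

```python
def narrate_metric(name, current, previous):
    """Turn a metric comparison into a one-sentence written analysis."""
    if previous == 0:
        return f"{name} is {current} (no prior period to compare)."
    change = (current - previous) / previous * 100
    if change == 0:
        return f"{name} held steady at {current}."
    direction = "up" if change > 0 else "down"
    return (f"{name} is {direction} {abs(change):.1f}% versus the previous "
            f"period ({previous} -> {current}).")

print(narrate_metric("Booked appointments", 132, 120))
```

A written line like this can be attached to a dashboard tile, sent as an alert, or read out by a voice assistant, covering the non-visual formats Arnold describes.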
More to come on specific use cases that go beyond traditional B.I., and on how cognitive RPA and B.I. are changing the way enterprises work.
TL;DR and some further reading/references: