Envisioning a future where dashboards stand the test of Generative AI in Digital Analytics
Lately, I've been asking myself some questions about the future of data visualization in a world where generative AI is becoming increasingly important.
Generative AI in digital analytics
We now have many solutions and projects that integrate AI features to:
Dashboards
Ah, dashboards...some even say that dashboards will no longer be necessary, that it will be enough to just ask "the AI" questions and get the right answers.
For now,
So what is the synthesis of the two? ...dashboards and data visualizations.
Analytics transformations with generative AI, under certain conditions
And thinking that we only need to be alerted is to forget that our profession, our value, our vocation also consist in absorbing knowledge and context, and demand that we keep our minds sharp. Scan, understand, explore, find, confirm.
We want the journey AND the destination.
Some will say we already have plenty to work with... but I wonder if we might risk losing, without realizing it, what drives us, and sometimes what inspires us, elevates us, makes us creative.
My words are deliberately exaggerated, but it seems to me the risk is very real:
Moreover, we also know that using generative AI necessarily requires maintaining a form of control and the ability to perform intermediate checks, which means never settling for a black box into which we feed requests and receive ready-made solutions in return. No magic, no reliable black box, no rabbit out of a hat. Quite the opposite, and we've been experiencing this for a little while now. We must follow the rabbit down the hole.
If there is one essential reflex when it comes to adopting AI within analytics activities, it's to preserve our analytical mindset. It must come back with even greater force and remain ever-present.
Data visualization and generative AI
Let’s return to data visualization, in an environment controlled by the digital analyst and conducive to knowledge acquisition and intellectual growth.
Respecting the following usual principles greatly simplifies the involvement of AI:
The critical bottleneck of long-established BI and data viz tools
Looking ahead at data visualization, one cannot help but notice a critical bottleneck in many BI and data viz solutions: a WYSIWYG editor designed exclusively for humans, with no programmatic interface, and therefore no easy automation.
For these tools, removing these limitations will be crucial in the medium term. They will have to move beyond the user who clicks and waits. The challenge is to be able to generate reports, or report prototypes, via AI agents orchestrated by more or less detailed, multimodal instructions.
And this will have to be done by opening their software to protocols such as MCP (Model Context Protocol) or by developing their own proprietary protocol for programmatic report generation and AI agents.
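To make the idea concrete, here is a minimal sketch of what exposing report generation to an AI agent could look like, assuming the official MCP Python SDK and its FastMCP helper. The tool name, the parameters, and the spec format are hypothetical placeholders, not any vendor's API.

```python
# Minimal sketch: exposing a report-generation capability to AI agents via MCP.
# Assumes the official MCP Python SDK ("mcp" package) and its FastMCP helper;
# the tool name, parameters, and spec format are hypothetical placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("report-generator")

@mcp.tool()
def generate_report(title: str, metrics: list[str], date_range: str) -> str:
    """Generate a report prototype from a short, structured instruction."""
    # Placeholder: a real implementation would build a report spec,
    # render it with the BI tool's own engine, and return a link or file path.
    spec = {"title": title, "metrics": metrics, "date_range": date_range}
    return f"Report prototype created from spec: {spec}"

if __name__ == "__main__":
    mcp.run()  # serves the tool so an MCP-capable agent can call it
```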
"BI as code", episode 2: On steroids with generative AI
In this context and at this stage, a family of BI tools seems to stand out, thanks in particular to the mindset in which they were designed a few years ago: BI as code.
All the benefits of industrialization and software-engineering controls, applied to data display: your reports are generated from configuration and layout files.
In short, a very favorable context for plugging in your AI agents to generate structured reports from your instructions, leaving you only to fine-tune the AI deliverable so that it becomes a business deliverable.
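As a toy illustration of the "BI as code" idea, a report can live in a declarative spec rendered by a small script. The spec schema below is invented for this sketch and is not tied to any particular BI-as-code product.

```python
# Toy illustration of "BI as code": the report lives in a declarative spec
# that can be versioned, diffed, reviewed, and drafted by a human or an AI agent.
# The spec schema and the sample values are invented for this sketch.
import yaml                      # pip install pyyaml
import matplotlib.pyplot as plt  # pip install matplotlib

REPORT_SPEC = """
title: Weekly traffic by channel
chart: bar
x: [Organic, Paid, Email, Referral]
y: [12400, 8600, 3100, 1900]
y_label: Sessions
"""

def render(spec_text: str, output_path: str) -> None:
    spec = yaml.safe_load(spec_text)
    fig, ax = plt.subplots()
    if spec["chart"] == "bar":
        ax.bar(spec["x"], spec["y"])
    ax.set_title(spec["title"])
    ax.set_ylabel(spec.get("y_label", ""))
    fig.savefig(output_path)

render(REPORT_SPEC, "weekly_traffic.png")
```

Because the whole report is plain text, the usual software controls apply: the spec can sit in a Git repository, be reviewed in a pull request, and be regenerated automatically.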
Among its many advantages, this would be a way to include AI in your data presentation layer without it, or its output, haphazardly replacing the interactive reports and dashboards we currently produce by hand.
Another obvious benefit is democratizing code-based solutions for a wider audience. Since code is currently the entry point for this type of visualization generation tool, it can easily be enriched through a graphical editing interface, a generative AI chatbot, or an automation program...or even all three at once.
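Combining the chatbot entry point with the intermediate checks discussed earlier could look like the following hedged sketch: an LLM drafts the report spec, and the spec is validated against an allowed schema before it ever reaches the renderer. The OpenAI Python SDK is used here only as one possible client; the model name, prompt, and allowed keys are placeholders for illustration.

```python
# Hedged sketch: letting a chatbot draft the report spec, then validating it
# before rendering -- intermediate checks rather than a black box.
# Assumes the OpenAI Python SDK as one possible client; the model name,
# prompt, and allowed schema are placeholders, not a recommended setup.
import yaml                # pip install pyyaml
from openai import OpenAI  # pip install openai

ALLOWED_KEYS = {"title", "chart", "x", "y", "y_label"}
ALLOWED_CHARTS = {"bar", "line"}

def draft_spec(request: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Answer only with a YAML report spec using keys: "
                        "title, chart, x, y, y_label."},
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content

def validate(spec_text: str) -> dict:
    spec = yaml.safe_load(spec_text)
    unknown = set(spec) - ALLOWED_KEYS
    if unknown:
        raise ValueError(f"Unexpected keys in AI-drafted spec: {unknown}")
    if spec.get("chart") not in ALLOWED_CHARTS:
        raise ValueError(f"Unsupported chart type: {spec.get('chart')}")
    if len(spec["x"]) != len(spec["y"]):
        raise ValueError("x and y must have the same length")
    return spec  # safe to hand to a renderer like the one sketched above

spec = validate(draft_spec("Bar chart of sessions by channel for last week"))
print(spec["title"])
```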
BI as code for digital analytics and digital marketing
On the other hand, certain current but surmountable constraints prevent or slow down the rapid adoption of "BI as code" or "data viz as code" in analytics and digital marketing in general. But many benefits encourage us to consider its use in our activities, for example the availability of open-source solutions under GPL licenses, and even MIT licenses for some.
Everything suggests that this trend, born in the BI world, will branch out into the digital analytics field. In light of this, the long-established providers of data visualization and BI solutions face significant challenges in overcoming the "by design" generational limitations of their tools. They have moved some of their pieces forward, but the road ahead is long.
Final thoughts
We will be here to implement the best strategies, with the most suitable solutions, for the benefit and sustainability of our projects.
In the meantime, the time for experimenting with available solutions is well underway, and we will certainly have the opportunity to discuss the changes, still somewhat tentative at this stage, in data visualization for digital analytics, changes likely to bring about greater transformations of our activities in the future... just like everything else.
This article remains an appetizer on possible ways to make generative AI and data visualization work together for the benefit of our deliverables. There is so much more to say about the benefits and the operational means to secure them.
On a personal note, I’ve started benchmarking and testing solutions and will probably share some examples in the future. So, see you soon for the next chapter of these adventures.
Marketing Consultant | Founder at Red Ink Community
I'm glad you spoke about the need for following the rabbit down the hole. My gut reaction to "no dashboards" / a query-only approach was to think about how many of my new ideas are the result of observing "irrelevant" data.
Data Analyst @ Knewledge
Generative AI in digital analytics can be fun enough, but when it comes to displaying metrics in dashboards we need the numbers to be predictable and consistent (even if they are consistently wrong! 🙂). Soon enough you will have a little Gemini chatbot in your Google Analytics property, ready to answer any questions you may have. The problem is that the answer will very much depend on HOW you ask the question, and two questions that may look very similar can actually return two completely different results. Sometimes the same metric can already mean different things to different people; this issue will be made much worse with generative AI. This is why hand-built dashboards, custom fields and SQL queries are not going anywhere for the foreseeable future. Maybe once LLMs are truly intelligent this will change, but for the time being they are still too unreliable and unpredictable. You can still get a lot of benefits from using LLMs, but they usually require you to set up quite a lot of rules and constraints to get consistent results from them.
Digital marketing & analytics shaped by data governance, privacy and ethics | Educator · Speaker · Consultant
A few years ago - well before the GenAI craze - I advised a startup (now defunct, though ahead of its time) that could ingest multiple data sources and automagically detect subtle shifts likely to impact KPIs - and explain it in plain text, with graphs, and digging abilities. Dashboards are fine for surfacing the known knowns - the KPIs we’ve already decided matter. But what about the signals buried in the data that we don’t track, the ones that eventually shape outcomes? By the time they show up in a dashboard, it’s often too late. That’s why I believe we’ll soon see a clear split: dashboards designed to reassure and reinforce internal narratives, versus intelligent systems that spotlight what truly deserves attention.