As it does at the beginning of each year, Piano gathered its partners at its offices on rue Taitbout. It was an invaluable opportunity to step back from operational projects and reflect on how our business is evolving.
Many thanks to Samia ABARA (VP Partnerships) for her warm welcome, and to Cédric FERREIRA (Chief Product Officer) for a presentation that was dense, lucid… and very much in line with what we observe in the field.
🌍 “The world no longer changes in 2 years… but in 2 months.”
Between new AI models, protocols like MCP, LLM integrations and constant acceleration, the technological environment is evolving at an impressive speed.
But beyond the “wow” effect, an interesting observation was shared:
📈 AI adoption is exploding.
📉 So is the abandonment rate of AI projects.
Many companies have tried. Fewer have actually transformed these initiatives into measurable business impact.
🤖 AI in analytics: two major gaps
1️⃣ The context gap
AI doesn’t need more data. It needs better context.
Counting actions is not enough. We need to understand the environment in which they occur.
A simple example: a drop in video consumption on a website. A problem? Not necessarily, if usage shifts to replay or podcast.
👉 Without context, numbers can be misleading.
2️⃣ The outcomes gap
Analytics tools need to deliver more than just dashboards. They must produce concrete results.
SaaS is not “dead”, but it is being challenged: customers expect value that is more integrated, more activatable, more directly linked to business issues.
🧩 The Piano answer: Data Excellence & Data Empowerment
Two pillars structure the roadmap:
🔹 Data Excellence
Structuring models, standardizing approaches, facilitating implementation.
Featuring:
- Data Sources by industry (media, retail, banking, travel, healthcare…)
- Data Source Studio to accelerate the implementation of a consistent tracking model
- Real-time implementation monitoring
- Automatic detection of existing models for existing customers
The aim: to turn weeks of scoping into a few clicks, and give AI a clean, coherent framework.
🔹 Data Empowerment
Circulate data throughout the company.
Some key developments:
📊 Workspace: ready-to-use workspaces built on activated data sources (gradually replacing Explorer). A more dynamic logic, adapted to each organization.
📩 Scheduled exports with AI summary: moving from a passive to a proactive logic.
🔎 Data Query redesigned
Another major project:
- Built-in natural-language querying
- Conditional formatting
- Bucketing on the fly
- Allocation directly in Workspace
- SQL generation for Snowflake environments
A clear desire to give teams greater autonomy without making use more complex.
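To make the "SQL generation for Snowflake environments" point concrete, here is a minimal illustrative sketch of how a simple query spec could be rendered as Snowflake-compatible SQL. The function, table, and column names (`to_sql`, `events`, `video_views`, `site_section`) are hypothetical examples, not Piano's actual schema or implementation.

```python
def to_sql(table: str, metric: str, dimension: str, limit: int = 10) -> str:
    """Render a tiny query spec (one metric, one dimension) as standard SQL.

    Purely illustrative: real SQL generation would handle quoting, filters,
    date ranges, and the customer's actual tracking model.
    """
    return (
        f"SELECT {dimension}, SUM({metric}) AS total\n"
        f"FROM {table}\n"
        f"GROUP BY {dimension}\n"
        f"ORDER BY total DESC\n"
        f"LIMIT {limit}"
    )

# Hypothetical usage: top site sections by video views.
print(to_sql("events", "video_views", "site_section", limit=5))
```

The value of this kind of step is that teams get a readable, auditable query they can run directly in their own Snowflake environment, rather than a black-box result.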
🔗 MCP: connecting LLMs to real data
With MCP, you can:
- Query your data from your favorite LLM
- Automate workflows (analysis → Confluence page creation → Slack sharing…)
But with a controlled approach: ✔️ No cross-customer analysis ✔️ Siloed data ✔️ AI opt-out available to meet internal constraints.
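For readers curious what "querying data from your LLM" looks like under the hood: MCP is built on JSON-RPC 2.0, and a tool invocation travels as a `tools/call` request. Below is a minimal sketch of that message shape; the tool name and arguments (`query_analytics`, `metric`, `period`) are hypothetical, not Piano's actual MCP interface.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request (MCP messages follow JSON-RPC 2.0)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool exposed by an analytics MCP server, scoped to one customer.
msg = mcp_tool_call(1, "query_analytics",
                    {"metric": "video_views", "period": "last_30_days"})
print(msg)
```

In practice an LLM client generates this call for you; the siloing and opt-out constraints mentioned above would be enforced server-side, in which tools the server exposes and which data they can reach.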
💡 What we learned from it
In today’s fast-paced environment, the temptation is great to “put AI everywhere”.
The vision shared here is more pragmatic:
- Structure before automating
- Give context before interpreting
- Produce value before producing dashboards
An approach that echoes many of our current discussions with customers.
Perhaps the real question is no longer “how do we add AI?” but rather “how do we make our data ecosystems truly exploitable and activatable?”