FELD M in Conversation with Jesse Rothenberg, Head of Data & Insights @Ableton
10.2.2026
Eilish Prioul
What you'll discover
- Jesse's career path
- User research in Ableton's Data & Insights team
- Quantitative data and qualitative insights
- Building data literacy at Ableton
- Bridging the business and technical sides
- What makes a great analyst
- Managing stakeholders
- Recommendations for learning and development
- Predictions for the future
- Closing insights
FELD M in Conversation is a brand-new series of interviews featuring interesting people from the data and analytics industry. It’s essentially a fireside chat (delivered to your inbox, if you sign up here). A space for people to share their experiences, approaches, and personal perspectives – going beyond frameworks and theories.
We hope that you, The Data Collective, will find voices that resonate and inspire.

“Technology and mathematics are neither the hard nor the important part. The important part is the people. That's why we have an economy, why we work, why we are in business, why anyone is in business: to serve people, not to serve machines, and so the value we add, we add to people. Value doesn't exist in a vacuum; it's a social construct.”
Jesse Rothenberg, Head of Data & Insights | Ableton
Berlin, at last, no longer resembles an ice rink punctuated only by gravel as Jesse Rothenberg and I meet at a little café in Friedrichshain. Sunlight drapes itself across our shoulders as we order our obligatory flat white and cappuccino, and settle in for a chat.
Jesse Rothenberg is currently Head of Data & Insights at Ableton, the German music software company behind the software and hardware staples Ableton Live and Ableton Push. He has been based in Berlin for the past 15 years – forever on a quest to find the city’s best taco spot – with a background in software engineering and statistics, and he’s been building and leading data teams for the past 16 years. Working at Ableton, he’s uniquely positioned at the intersection of creativity and data, bringing a fresh perspective to what “Data & Insights” can encompass. He’s also generally just a fun, nerdy human that I learn a lot from.
Ableton is pretty unique in that it’s a business operating in a creative space, built for creatives, and built by a team replete with musicians and creative types. Half of Jesse’s team members make music themselves in some capacity.
How does this creative environment shape his role? As we’ll discover, it impacts every aspect, from the structure of his team and the very types of data they work with, to the approach he takes to integrating data into the company culture and getting big projects across the finish line.
Starting at the beginning: Can you share a little about the path that led you to your current role?
“I started my career as a software engineer in the late days of the first dot-com boom, and discovered (via some very grey days in Finland) that I was ready to switch things up.”
Following graduate studies in economics, he moved to Berlin and took on his first data-related role as a marketing analyst. In that role, he helped rebrand, rename, and build the function into a BI team. A spell away from data saw him build and run his own startup with his business partner, taking it to cash-flow positive, before he returned to the data world.
“From there, every role has involved additional team members, bigger scope, further growth. My current role extends well past BI, touching on many other topics of data governance, management, quality, but also user research.”
User research? That’s not something I often hear about as part of a data team. Is that something that you would have always seen as under that umbrella, or was it something that developed out of the Ableton company structure?
“Having qualitative data (like user research), and not just quantitative (structured) data forms, under the roof of Data & Insights at Ableton has been really positive. It’s enabled us to use some of the qualitative discoveries to drive prioritization, and to use some of the quantitative knowledge to guide further research – it’s been a virtuous cycle. I think we've been able to unlock a lot of value as a result.”
We touched briefly earlier on the fact that Ableton has creativity at its core. In a product-led company like Ableton, how important is quantitative data compared to qualitative insights? Are both necessary for success?
“Ableton is special in that it has always had a strong connection to its target market; the product leadership and designers have been very good at understanding what the market wants. And so it's therefore unsurprising that their first views into ‘how can we learn more?’ were qualitative.
There was only limited precedent for more quantitative analysis when I joined the company – demand for BI leaned more towards finance and sales than product.
I’ve been building that understanding of and reliance on quantitative input from the ground up, and that’s not something that comes naturally if the company doesn't have quantitative work in their DNA.”
How did you approach getting data a seat at the table? Is there something you’ve found universally helpful for fostering this understanding?
“You need internal champions across the company – people naturally aligned with you, who want numbers, and come to you for help and information. Find them, support them, and empower them to advocate for you within their teams. By demonstrating how much more effectively they are working now that they're working with you, you essentially let the results speak for themselves.”
Eilish: It's a bit like finding product evangelists in that sense, but internally?
Jesse: Right. It's the same thing. It's about figuring out who's interested, who's engaged, who wants your help, and then multiplying those voices.
Can you tell me more about how you’ve tailored your approach to building data literacy at Ableton specifically?
“Your approach is always going to vary based on the company, its culture, its history, and what the historical organizational traumas are.
Different organizations carry different traumas. Some might have had a release that was intended to be a complete rewrite of the core engine and went terribly: the release was delayed for two years, with disastrous consequences, and it makes everybody very cautious about making big changes in the future. That’s one example of an organizational trauma. Or there could be smaller ones that make people want to avoid being too number-driven, like the story where Google famously A/B tested 41 shades of blue for a button, and their visual design lead resigned over it.”
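(A quick editorial aside: mechanically, the kind of test in that story usually boils down to comparing the click-through rates of two variants. Below is a minimal sketch in Python using statsmodels – the counts are invented for illustration and have nothing to do with Google’s or Ableton’s data.)

```python
# Hypothetical two-variant A/B test: do the click-through rates differ?
# All counts below are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

clicks = [1513, 1602]         # clicks on variant A and variant B
impressions = [48000, 48100]  # how often each variant was shown

# Two-sided z-test for equality of the two click-through proportions.
z_stat, p_value = proportions_ztest(count=clicks, nobs=impressions)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```

The statistics are the easy part; as Jesse’s anecdote suggests, the cultural questions around such tests are what organizations actually wrestle with.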
On a personal level, Jesse is deeply privacy-focused. What he didn’t initially realize was that his colleagues are, too. He joined Ableton around seven years ago, at a time of increasing awareness of the responsibility of handling people’s data, and of what many other companies were doing with that data. Several of his early attempts to work more closely with the product teams stalled as a result: his colleagues were cautious about using, say, product usage data to drive improvements.
So how did you address it?
“The important thing was to take a step back and try to understand why people found this challenging. I interviewed a number of my stakeholders and learned that people were really worried about how we were going to handle the ethics of data collection – more than they were worried about us using data to tell them how to make decisions about product development.
This ultimately led to us prioritizing a data governance initiative and putting concrete data governance standards in place.
Ableton is a very consensus-driven company, so we addressed that by establishing a cross-organizational work group of 22 team members to tackle the question of how we handle data at Ableton. The two most important outcomes were:
- Creating a data-handling framework. This defined how we work with data along its entire lifecycle (because the end of the lifecycle is important too!).
- Creating a data ethics framework and guidelines that went beyond the regulatory requirements. This outlined how we work with data from a data ethics perspective, and it really helped us in unblocking and unlocking a lot of the opportunities to keep moving forward.
It’s not what all companies need, but it’s what Ableton needed for that extra reassurance. Our ethical guidelines are a helpful resource to point to when we get new questions on the topic, and the fact that it had buy-in and sign-off across the organization was culturally important to get it across the finish line.”
Beyond cultural factors, what else does a data team need to be successful within the organization? Would you say that C‑level support is the most important factor?
"You will fail without it. Necessary but not sufficient, to use a maths term!
I can’t stress enough how important executive sponsorship is. While there are a lot of things you can do to try to influence culture and change it, it’s a long, slow process. If the executives are not standing behind you, you will have a much harder time succeeding."
Since executive support is so crucial, how has your role shifted over time to accommodate that? How do you manage and experience your role on that bridge between the business and technical sides?
“First off, it's a small team, so I (unofficially) wear many hats: strategic development, tech lead, people manager, product owner, and interface between the rest of the leadership group and the team itself. So indeed, it's a challenge to balance all that.
It's really about being consciously aware that you're making a choice about where you're putting your time. And while you can probably allow yourself to put some hats down for a meaningful amount of time, you've also got to be aware of what the cost of doing that is.
My personal balance has shifted over the years as the team has changed in size and scope. And as the maturity of the organization has changed, I've spent less time doing engineering work or (technical) leadership, and more time on strategic direction.”
Has that shift made you rethink what a “great” analyst looks like on your team?
“I'm increasingly convinced that a strong communicator in an analyst role is extremely valuable. Effective communication of quantitative information is a challenging topic, and it's a special person who has the verbal and visual skills to be able to communicate that information effectively.
I would see huge value, for example, in having a team member whose primary role or responsibility is something along the lines of creating a one-page infographic that communicates the outcome and output of everything else that the team is working on.”
Around three or four years ago, I saw an increased demand for “data storyteller” roles, combining design and statistics to communicate data visually. It seemed like it was a bit of a trend up until the AI craze hit. Has GenAI influenced your hiring approach?
“My interviews have generally never involved coding challenges or take-home work for applicants. The question I’m instead trying to answer in interviews is ‘How does this person learn?’. It takes a particular mindset to want to learn. I try to only hire people who have that mindset, and I don't know that that would change with any tool advancement.
I have some scepticism about trends, having seen quite a few come and go over the years. It doesn't mean I'm necessarily against or avoidant of new technologies. But I would always prefer to hire someone who understands the nuts and bolts of how something works than someone who has mastered a specific framework.”
Do you think great analysts are taught, or do they develop a feel for the work over time?
“Both are equally important.
It’s a broad field, and there’s a lot to know: languages, skills, methodologies – and that's just the hard skills side of things. And there are a lot of little caveats that you just have to know, and you only get that through exposure and experience.
It can be important for an analyst to understand, for example, the differences between fixed and random effects, or how to construct a hierarchical mixed model. But it's just as important – maybe even more important to most of your stakeholders – that you understand the business context.
It’s also vital to know when to keep things simple.
Often, more junior analysts will know the perfect technique that will control for all the biases. They’ll spend two months working on it and come out with something really incredible – but actually the stakeholder just wanted a rough estimate, okay to within 10% either way, and they needed it two and a half weeks ago.
So you need a level of learned pragmatism and business sense, and a big dose of the ‘keep it simple, stupid’ principle. It's as much about knowing when to break open that advanced toolbox as knowing when not to.
The more experience someone has within a particular domain, the more value they're going to be able to contribute back to their stakeholders within that domain. And the longer they've been working with a team, the more camaraderie there'll be, the better understanding there'll be, the easier and faster it will be for the stakeholders to get the answers they're looking for. Because stakeholders often don't ask the right questions.”
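(An editorial aside on the distinction Jesse mentions above: a fixed effect is a single coefficient shared by everyone, while a random effect lets each group – say, each user – carry its own baseline. Here is a minimal sketch in Python with statsmodels; the data and the variable names `sessions` and `minutes` are synthetic inventions for illustration, not anything drawn from Ableton.)

```python
# A minimal, hypothetical sketch: the same slope estimated with and without
# a per-user random intercept. All data below is synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# 20 users, 30 observations each; each user has their own baseline.
n_users, n_obs = 20, 30
user = np.repeat(np.arange(n_users), n_obs)
baseline = rng.normal(0, 5, n_users)[user]            # per-user offset
sessions = rng.poisson(3, n_users * n_obs)            # hypothetical predictor
minutes = 10 + 2.0 * sessions + baseline + rng.normal(0, 2, n_users * n_obs)
df = pd.DataFrame({"user": user, "sessions": sessions, "minutes": minutes})

# Fixed effects only: one shared intercept and slope, grouping ignored.
ols = smf.ols("minutes ~ sessions", data=df).fit()

# Hierarchical mixed model: shared (fixed) slope + random intercept per user.
mixed = smf.mixedlm("minutes ~ sessions", data=df, groups=df["user"]).fit()

print(f"OLS slope:   {ols.params['sessions']:.3f}")
print(f"Mixed slope: {mixed.fe_params['sessions']:.3f}")
```

The plain OLS fit pools all users together; the hierarchical model estimates the same shared slope while absorbing per-user variation into random intercepts – often what product data with repeated measurements per user calls for.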
Let’s hang out there for a moment: Stakeholders often don’t ask the right questions. How do you tackle that?
"One of the hardest things to train an analyst in is that it's okay for them to be the expert in a conversation with their stakeholder. And that their role as the analyst is to communicate with the stakeholder, to get to the bottom of:
- What is the question they’re really trying to answer?
- And what are the metrics that can help answer that question?
And to use their own business knowledge, domain knowledge, and statistical knowledge to drive the conversation in that direction, rather than just thinking ‘my stakeholder wants to know x’ and taking that at face value.
There will be a lot of stakeholders who will need training in how to ask the right questions, and it's important that you bring them along with you and give them the opportunity to learn."
Enter, stage left: the sound of drilling, as the café’s neighbors begin some perfectly timed DIY.
“God, that’s annoying. Shall we move —”
Safely reinstalled into the opposite corner of the café, where the drilling feels just slightly less like it’s going directly into our heads, we pick back up where we left off.
Speaking of learning: which resources do you still trust for learning and development?
“Some classics that are still relevant today are:
- Edward Tufte: The Visual Display of Quantitative Information
- Richard McElreath: Statistical Rethinking
- Peter Kennedy: A Guide to Econometrics
- The statistics blog by Andrew Gelman and colleagues at Columbia University’s statistics department is outstanding.
Events-wise, I enjoyed the people and conversations at the Chief Data Officer Network (CDO Network) in Frankfurt last year, and the panel I participated in at the Chief Data & Analytics Officer Conference (CDAO) in Munich covered both an SMB and an enterprise perspective – we were able to uncover some interesting parallels with Microsoft.”
Predictions about the future are always difficult and often wrong, but what’s your guess about how our tech landscape and the jobs of analysts and engineers might change?
“I think the important things are going to stay the same, and that's communication, people skills, and stakeholder management. In a scenario in which more of the work becomes automated, those will be the things that set people apart.
If the proponents of GenAI and the people who are singing its praises for all things engineering are correct and successful in what they're building, there's a world in which stakeholders will expect to be able to ask a machine a question, and that machine will have the answer, and all the data at hand.
If that happens, I see the role becoming less mathematical, less technical, requiring more understanding of the data domain, data cleaning, and data management, and on the other hand, requiring stronger people skills.”
One last question: If someone only reads the first two minutes of this interview, what's one thing you'd want them to understand about working with data?
“Can I have two things?
Technology and mathematics are neither the hard nor the important part. The important part is the people. That's why we have an economy, why we work, why we are in business, why anyone is in business: to serve people, not to serve machines, and so the value we add, we add to people. Value doesn't exist in a vacuum; it's a social construct.”
The other is a quote from the famous statistician George Box, dating from the 1970s:
“All models are wrong, but some are useful.”
“It's important to remember when you're presenting your results to anyone, to any stakeholder: you don't have the truth, you have a perspective on the truth.”
Jesse: At the risk of getting metaphysical here: what is truth, Eilish? What is the truth? What is a fact?
Eilish: I'm going to order another coffee now and have an existential crisis.