Good morning. Good afternoon. Good evening. Good whatever time it is wherever you are on this amazing planet of ours.
I am Malcolm Hawker. I’m the host of the CDO Matters podcast. Thanks for joining today. Happy New Year; by the time you see this, we will be well into twenty twenty six.
I hope you had a wonderful holiday season. Hope you had a great New Year and that your new year is off to an amazing start. Speaking of an amazing start, I’m absolutely thrilled to be having the conversation today with Karthik Ravindran.
Karthik is the general manager of go to market for data and AI for a little company called Microsoft. Karthik and I have known each other for a few years now, I think a few years is correct to say, through partnerships with Prophecy, through the work that we’ve been doing with Purview and with Microsoft and in the Azure ecosystem. We’re gonna talk a lot about that today. We’re gonna talk about AI in general. We’re gonna talk about some of the people impacts of AI.
We’re gonna talk a lot about data and AI because there isn’t a smarter person in this space than the person that I’m talking to. So, Karthik, with that, thanks for joining today.
Thank you so much, Malcolm. It’s so great to be on your show, and congratulations on your data heroes playbook, the book that you just published. It almost brought my last three years of life and journey back, like, on paper, and it’s wonderful to see things like that resonating with leaders like yourself and the stories that you shared. So thank you for that.
You’re so kind. Thank you. I didn’t even plan a book promo, but you just did one for me. So that’s fantastic. So first thing, I just looked at your LinkedIn profile. You’ve been with Microsoft twenty six years.
That’s amazing.
I can’t even imagine how much change you’ve seen in twenty six years. But first question. Take us back to, you know, twenty five and a half years ago. You’re a young Karthik. You’re running around in Redmond with some of the biggest and greatest and most important names in computing. What was that like?
Yeah.
Oh my God. Tell me about that. I mean, when I started out my journey at Microsoft, I came in like a starry eyed youngster for sure.
And it has always been an aspiration for me to work at Microsoft. And my first role interestingly was actually a developer support engineer focused on data APIs and SDKs.
Oh, really?
Okay. So remember the days of ODBC and active data objects and all of the data libraries that Microsoft used to ship to build database apps for SQL Server. It’s like, I was the guy on the front line doing support engineering for almost seventy customers.
And one thing led to another; went from support to product management to engineering, and then, about ten years into my journey at Microsoft, I found my passion: data. Ever since then, it’s been data, data, data.
Right? So various different roles, leading data teams of product units at Microsoft, going on to do, I think, what was then one of my most enriching learnings, which was building and running Microsoft’s internal data office.
And then from there, took those learnings and went to Purview, where I had the opportunity to build version one of the new Purview data governance solution, which went to market like a year and a half ago. Yep. And then now I’m in a dream role. The dream role now is what you introduced me as, which is leading the worldwide go to market and sales plays for all things data at Microsoft.
This is amazing because it brings everything together, it gets me to work with customers, understand the challenges that customers are navigating, and just it’s fascinating to learn about how data is such an important thing for everyone across every industry.
And couldn’t be a more exciting time to be in a role where you get to share those stories and help customers be successful with their data.
Well, so one thing that jumped out to me, you did a tour of duty in product management,
which I think is critical. Well, it’s not critical.
You can succeed in data and analytics or as a data leader without a product background, but I think having that product footing, right?
Like, where you are razor focused on solving customer needs and building great solutions for customers, I think makes for a better data leader. The data leaders that I’ve known that have had a lot of success, one of their common threads is having some experience in product management. So that’s a good segue into the concept of data products as a whole. What is your perspective on data products? And, you know, there’s a lot of people that see them as, you know, just kind of a part of the medallion architecture, and that’s fine. There’s other people who see them as an output of a product management process. Where do you kind of see data products fitting in the broader scheme of things?
That’s a great question. Yeah, I think data at times ends up being executed more as a project versus as a product. And especially when I had to go to the data office in Microsoft, it was one of the biggest change management curves that I had to drive, because the organization at breadth hadn’t been in product roles prior. And it was very focused on the operational execution and on running data initiatives as projects, time-boxed projects.
And, around that time is when this whole concept of the data mesh started to become really big in the industry with business units starting to become very savvy with data, wanting to own their own data charters and destinies.
And I think this whole concept of saying like, hey, great, like, we got to get to a place where the investments that we make in data are not just time bound projects, but are more valuable assets.
And bringing in the product thinking, the way I like to look at it is: forget the data technology for a moment, just think about products in the general consumer sense. In the general consumer sense, products are meant to serve the needs of specific personas. They’re meant to be easily discoverable, understandable, and usable by the personas that they’re being built for, right? They’re meant to be easy to support. They’re meant to have SLAs around them.
And if you took all of those characteristics and now brought them to data, data deserves the same type of treatment. Right? To truly deliver data to the consumers of data, you apply the product mindset: making it relatable to the persona who needs to benefit from it.
Ensuring that the persona can discover and use it to achieve their purposes, and making sure that the persona is also supported with the SLAs, or the service level agreements, that they need to be able to effectively and safely use it, are all considerations that would apply to any product, now contextualized to data.
The key thing with data products is I think the grain in which you think about it, you know. There’s various grains at which people talk about, dashboards, reports, so on and so forth. But to me it really comes down to, there are personas for data consumption across the supply chain of data, right? A data scientist is a consumer of data. For a data scientist, a very well cleaned customer master dataset that they can take, rely on to go stitch and build machine learning models for prediction, churn prevention, other customer use cases is worth its weight in gold.
But for that data scientist, that clean customer data set is a product, right? The model that the data scientist produces could be a data product for the next user of the value chain, right? So it really comes down to understanding who the personas are, looking at it through the consumption lens of data for that persona, and then tailoring a product that can live up to the bar of a consumer product for that persona. When you start applying that lens, I think you start thinking about data in a different manner, right?
Which is not just as a time bound project to serve a certain time bound need, but more as an opportunity to create an asset which can then be self-servable, which can be consumed by wider audiences, and which can also become the building blocks to create higher value products which can then feed back into the supply chain, right? So that’s kind of how we approached it and navigated it, and it was quite the mindset shift that we had to traverse, but once you were able to relate it to the users, the personas and their needs, and the value outcomes, it became more of a natural shift.
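To make the product framing above concrete, here is a minimal sketch, in illustrative Python (all names and fields here are hypothetical stand-ins, not a Microsoft or Purview API), of a data product "contract" that records the target persona, the descriptive metadata that makes the product discoverable and understandable, the SLA, and the upstream products it builds on, so products feed products along the supply chain:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Illustrative data product contract: persona-focused, discoverable, SLA-backed."""
    name: str
    description: str       # what makes the product understandable and discoverable
    target_personas: list  # who this product is built for
    sla: dict              # service-level commitments consumers can rely on
    upstream_inputs: list = field(default_factory=list)  # products consumed upstream

# A cleaned customer master is a product for the data scientist...
customer_master = DataProduct(
    name="customer_master_clean",
    description="Deduplicated, validated customer master dataset",
    target_personas=["data scientist"],
    sla={"freshness_hours": 24, "completeness_pct": 99.5},
)

# ...and the churn model built on it is a product for the next consumer in the chain.
churn_scores = DataProduct(
    name="churn_risk_scores",
    description="Per-customer churn probability, refreshed daily",
    target_personas=["marketing analyst"],
    sla={"freshness_hours": 24},
    upstream_inputs=[customer_master.name],
)

print(churn_scores.upstream_inputs)  # the supply chain: products feeding products
```

The point of the sketch is the mindset, not the mechanism: every asset carries its persona, its discoverability metadata, and its SLA, rather than existing only as the output of a time-boxed project.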
Yeah. I love that. A few things that I really love about what I heard: one, that you really kind of stressed the customer, the persona. You used the word persona, but the customer. Like, who’s actually consuming this?
That’s critical. Another thing that you certainly stressed was value delivery. Right? That’s certainly important. And another thing that you stressed, which sometimes I think we get really hung up on as data people, because I think many of us are at heart kind of librarians, and we wanna classify things into one box with a very specific definition and apply these kind of deterministic rules about it is or it isn’t. Sometimes we get hung up on the fact that a raw material to one could be a finished good to another, or vice versa. Product managers, they’re like, hey, great.
As long as I’m delivering value, all good.
But a lot of the data people are like, oh, wait a minute. Is it a field, or is it a dashboard, or is it an API? And it can’t be both a raw material and a finished good. It’s gotta be one or the other. We get hung up on that stuff.
I loved a presentation you gave a couple years ago. You talked about, you were very open about, Microsoft’s journey in this space. You just shared some of the highlights around your journeys with data products. But one of the greatest presentations I ever saw you give, and I remember it vividly, was on the top floor of a Hyatt hotel in Cambridge, Massachusetts, for the CDOIQ conference. That would have been, I forget, two years ago.
And you you just kinda opened up and said, here’s what we did at Microsoft, and you shared your journey.
What are, like, you know, two or three of the biggest lessons that you’ve learned, and what role did failure play in the learnings?
Hundred percent, great one. I think the first thing needs to start with the biggest learning for me, which was: I went into that role of the CDO thinking it’s largely a technology challenge, because there was a whole bunch of infrastructure from the legacy that had to be reshaped. But about a month into the role, I quickly realized that it’s anything but a technology challenge. The technology was interesting; there were certainly hard problems to solve. But the people dimension and the change management dimensions that are needed to make data truly successful, in terms of its applied value outcomes, were way more pronounced.
The cultural alignment that had to be navigated across the enterprise to not just view data as a valuable asset, but more importantly to align the operating model.
To bridge completely polar opposite, I would say, spectrums of thought across groups of folks, with a department like IT wanting to centralize everything versus the business units wanting to decentralize everything. And then trying to strike the balance by saying, it’s not an either/or, it’s an and. Things that can and should be common should be at the core, enabling a responsible foundation which doesn’t block but which accelerates the innovation at the edge, right? So this whole shift in mindset and practice model, from the center of excellence COE to the champion of enablement COE for a data function, and to bring in the wider organization to partake in the supply chain, was I think one of the biggest learnings for me personally, right?
And this is such a big difference coming from the product background that I was in before, where in the product team it’s all like, hey, you own it, you build it, you run it. But here it’s like, you’ve got to galvanize and bring together the entire ecosystem and enable them to go on that journey with the data. And that’s the core part of the challenge. So that was like learning number one, and then the change management that goes with it; you just develop an entirely different appreciation for the people and the practice dimensions of a data journey, as opposed to just the technology, right?
So that I would say was learning number one. Learning number two is just the importance of recognizing that there is no one data team that can do it all for any organization. Right? And oftentimes during that journey I also used to talk very frequently with customers, presenting at EBCs, sharing the Microsoft data journey, and interestingly, at that time, several of our conversations used to start with, hey, the CDO would come in and say, I need to control data.
I need to own data, you know, and help me figure out how do I build a world class center of excellence, right? And that leads into the conversation of, hey, rethink the COE, to the first point. You’re going to be far more successful being a champion of enablement versus trying to be the center of doing it all. And here’s the reason why: it’s because the context that is needed to apply data really well sits across your organization.
And the importance of recognizing that the true success of a CDO is to bring together that rich context across the organization, and then to activate users across the organization to be able to discover, understand, and apply data for their use cases, while still providing them a really solid foundation which ensures that they stay within the safety parameters, so that they can truly innovate without violating any of the compliance considerations that they have to stay compliant with. That, I think, is a great opportunity for CDOs and data leaders: to not just build a responsible data foundation, but to also help ensure that the foundation activates its core purpose, which is to drive the acceleration of agility and innovation at the edge.
And then recognizing that the skills and the experiences and the context that you need to do all of that is an organization wide team sport, right? It’s not an individual sport. So those two changes and navigating that ecosystem along with, of course, several of the technology challenges that we can also talk about, I think are some of the biggest learnings for me personally in that journey.
Yeah. I love it. Something that you stressed there, which is gaining a little more traction, which will be a good segue into the conversation around AI, was this notion of context. Right? What marketing needs is different than what finance needs, and they can both be right at the same time. Something else that you just stressed, which is so, so important, is that alignment in the operating model.
And you highlighted it, which was, you know, hey, center of excellence, great.
Command and control, fine. More centralized data management, fine. But if your organization is inherently domain autonomous, right, where the domains have a high degree of autonomy, which is how it is at Microsoft and many, many other companies, if you try to enforce this kind of top down operating model that doesn’t align, you’re not gonna have a lot of success. You’re gonna run into a lot of barriers.
And I’ve certainly experienced this in my past. That’s something that I really, really appreciated in the story that you tell, which is, you know, make sure that your approach aligns to your overall organizational approach, and deal with some of the imperfections that may come with that. Yeah. Right?
But it’s your business strategy. If your business strategy is to allow domain autonomy, you’re not gonna change that as a CDO. And you may have a couple of different customer master records when you’re searching in your catalog for customer. You may see three or four, but that’s a function of how your organization is aligned.
Not any sort of broader failure. Do you agree?
Hundred percent, hundred percent. The operating model has to meet the organization at its starting point. Yep. And there could be opportunities to improve that, to evolve it, but not to just go in there and brute force try and replace it, right? And then once you establish that starting point connection, especially with the stakeholders, all of whom have to be successful with your data.
The openness to wanting to take a journey to improve towards greater opportunities will come naturally. You can’t force it, right? I think well said, yeah.
Okay, let’s get back to context. And one of the announcements out of Ignite recently that I was most excited about (this shows how nerdy I really am) was the ontology capability in Fabric.
And why I was excited about that is because context to me, and this is gonna sound a little pithy and high level, but context, and more specifically context as represented in ontologies and knowledge graphs, is to me the unlock between unstructured data and structured data. And if Gartner is right, and they say eighty to ninety percent of the data out there is unstructured, although we could have an academic discussion about, you know, is XML structured or not, of course it is, and I don’t need to get into an academic discussion around semi-structured and all that. But text files, PDFs, stuff sitting on SharePoint servers, email, Outlook servers, I mean, all of that needs structure, while everything we’ve got in relational stores needs context.
To me, the meeting point of both of those worlds is knowledge graphs.
Do you agree? Am I oversimplifying? How do you feel about this?
I think you’re spot on. Look at the end of the day, a connected data estate needs to string together data agnostic of its modality that is needed to serve a certain purpose. The reality of the world is you’ve got the spectrum from structure to semi structured to unstructured that makes up the context for almost every business operation and function in this day and age. And now with AI, it’s more so increasingly multimodal versus not.
Yep. I think there is one secret ingredient, which you know well and which we’ve talked about quite a bit, which often gets ignored in this mix, and which is metadata, in addition to data, right? Yeah. Metadata is the diamond.
If data is gold, metadata is the diamond, right? And I think people have to place some intentional focus towards really thinking about how metadata can be the glue that strings together all of these different data modalities, you know, in a common parlance that can then be applied for greater context definition, ontology modeling, and everything else. And that metadata is not just technical metadata, right? I mean, a lot of the initial conversations I used to have over metadata were about table structures and data types and column types and so on, but there’s an entire spectrum of metadata that elevates from the technical to the business context and semantics, right?
It could be something as simple as a business term in a glossary that annotates a data asset or data product, all the way through to a more complex mix of relationships, like a knowledge graph, all of which are attributions gathered from the spectrum of the data modalities.
And constructing the knowledge graph, which by the way can be further enriched when you also bring in what’s called the work graph, right? The work graph is more the connection of how people get work done on a day to day basis, the communications, the activities. So what you heard us talk about at Ignite, the Work IQ dimension, which is built on the Microsoft three sixty five graph. When you take that work context and stitch it together with the data context, something magical starts to happen, where data and AI start to activate in the flow of work. Right? So bringing those worlds together into a more comprehensive knowledge graph that includes both the work and the data intelligence is kind of the core crux of what we were looking to land and communicate in terms of our vision and direction at Ignite, which recently happened.
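As a toy illustration of that stitching (plain Python with made-up entities, not the actual Microsoft 365 graph, Work IQ, or Fabric ontology APIs), a knowledge graph can be modeled as typed triples, where glossary metadata annotates data assets, lineage edges connect assets, and work-graph edges connect people and activities to the same assets:

```python
# A minimal knowledge graph as (subject, relation, object) triples.
# Everything here is illustrative; real systems use graph stores and ontology models.
triples = [
    # data context: business metadata and lineage over data assets
    ("customer_master", "annotated_with", "glossary:Customer"),
    ("churn_scores", "derived_from", "customer_master"),
    # work context: how people actually get work done day to day
    ("alice", "owns", "customer_master"),
    ("weekly_churn_review", "consumes", "churn_scores"),
    ("alice", "attends", "weekly_churn_review"),
]

def neighbors(node, rel=None):
    """Follow edges out of a node, optionally filtered by relation type."""
    return [o for s, r, o in triples if s == node and (rel is None or r == rel)]

def lineage(asset):
    """Walk 'derived_from' edges to collect an asset's upstream sources."""
    upstream = neighbors(asset, "derived_from")
    for u in list(upstream):
        upstream += lineage(u)
    return upstream

print(lineage("churn_scores"))  # data context: ['customer_master']
print(neighbors("alice"))       # work context: what alice owns and attends
```

Because both worlds share the same graph, a query can cross from "which meeting consumes these scores" to "which dataset and glossary term they trace back to," which is the kind of connected data-plus-work intelligence described above.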
Well, what I love about what you just said is that your business process is what we’re trying to serve here. Right? And we’re trying to optimize business processes, a.k.a. the work.
Over the last couple of years, when I’ve had discussions with data leaders about Microsoft’s position in the market, particularly related to Purview and Fabric together, Purview as the means to manage metadata, and Fabric as a way to kind of operationalize and create a single data management plane.
One thing that I’m most excited about with Microsoft’s position in the market is that they not only own, for lack of a better word, it’s not a great word, but they not only own the analytical plane, but the operational plane as well.
And I do see a world here, whether it is unification of the Dynamics cloud and the Azure cloud, where you’re able to close the loop.
Right? And what do I mean by that? Like, a marketer’s dream scenario is that they have a hundred percent visibility on attribution. They can close the loop: I spend a dollar on marketing, and then I get a dollar back in the front door.
With Microsoft, I think you’re uniquely positioned in that if you make a recommendation for a better way to manage data or govern data or structure data or define data, you can trace that eventually one day all the way back into what’s happening in those operational systems.
Am I talking crazy here or are these things that you’re talking about at Microsoft to the degree that you can share?
I think you totally nailed it. Look, the data estate is now evolving to what I would probably describe as a translytical data estate. Right? There’s a transactional component, there’s an analytical component, there’s an AI component.
And every one of these edges is both a source of data as well as a consumer of data. In fact, even the transactional edge, which so far has traditionally just been known as a producer of data. But in reality, there’s something called reverse ETL that’s been talked about in the industry for ages. And now most transactional, operational systems are also getting AI embedded into them, inferencing against broader data signals, including analytical signals, to generate intelligence back into the transactional systems.
That loop is coming full circle now, right? It’s no longer a left to right pipeline, it’s a virtuous cycle, right? Operations and transactions to analytics to AI, and then the spin wheel continues, right? So that world is definitely coming together, and what’s really interesting to see now is, I think, a greater appetite from customers, especially with the advent of the age of AI, to do something which I think practitioners like you and I have been trying to evangelize for a while now, but which a few years ago was always an uphill discussion, which is the importance of unifying your data estate, right?
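A hedged sketch of that reverse ETL loop, with hypothetical table and column names (the stores are stand-in dicts; a real pipeline would query a warehouse and write back through a CRM API): an analytical churn score computed from warehouse signals is pushed back into the operational record, so the transactional system can act on it:

```python
# Sketch of the operations -> analytics -> back-to-operations loop (reverse ETL).

crm = {  # operational/transactional system of record
    "cust-1": {"name": "Contoso", "churn_risk": None},
    "cust-2": {"name": "Fabrikam", "churn_risk": None},
}

warehouse_signals = [  # analytical signals landed in the warehouse
    {"customer": "cust-1", "logins_last_30d": 1},
    {"customer": "cust-2", "logins_last_30d": 22},
]

def score_churn(logins):
    """Toy rule standing in for a real ML prediction: low usage means high risk."""
    return "high" if logins < 5 else "low"

# Reverse ETL: push analytics-derived intelligence back into the operational edge.
for row in warehouse_signals:
    crm[row["customer"]]["churn_risk"] = score_churn(row["logins_last_30d"])

print(crm["cust-1"]["churn_risk"])  # 'high': the CRM can now trigger a retention play
```

The write-back at the end is the step that turns the left-to-right pipeline into the virtuous cycle: the operational system is no longer only a producer of data but also a consumer of intelligence.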
Yep. And unifying a data estate doesn’t necessarily have to mean a physical sort of enterprise data lake. It could just be a logical coming together of your end to end data estate into a construct that makes it easy to create connected intelligence and insights. But the uphill conversations that we used to have, you in the MDM context, as you know, and me at that point in time in the data lake and lakehouse context, to get people to think about unification, have now become a natural normal in customer conversations. Because with AI, the importance of AI needing to have all its rich connected data of pertinence to the use cases is now driving a greater openness to say, you know what, oh my God, we do have to solve this data unification problem, right?
We want to solve this by meeting our estates where we are at but we also realize that in some cases we might have to modernize infrastructure. We might have to do something different to actually get the best unified data served to our AI. The openness has become a lot more, I would say like natural now than what it was a few years ago.
And this is now resulting in an innovation wave where, no matter which data cloud provider, everyone is trying to solve this hard problem of how we can bring this translytical data estate onto a common underlying data backbone, you know. And while you might have some specialized, engine specific or workload specific engines to serve the needs of high scale operational versus analytical versus real time AI systems, is there at least a way to get to a common data backbone that can serve all those scenarios, right? And that’s becoming more and more real now with the shifts that are happening in the industry.
I still think we’ve got some distance to go to get to a place where that’s completely unified. But at least what we’re doing with Fabric and OneLake and all the engines on top is to say, hey, can we at least get to a place where something like OneLake can become the backbone, on top of which the engines that are contextualized for each of the workloads can operate? You know, and then stitch together the underlying data estate in a way that has been hard to do in the past.
Right? So we are on that path today. Yeah.
Yes. And what I like about that is, well, I remember when Microsoft launched Fabric. At Gartner, I had been talking about fabric for a long time. Right? We had kind of conceptualized the data fabric, and others generally smarter than me at Microsoft had kind of conceptualized it too. But when you launched it, I was really excited, because when I thought about it, and then I thought about it some more, what I came to in my head, now this may not have been a stated strategy of Fabric, but what I came to in my head was: they’re commoditizing persistence.
Right? If you create this common virtualized layer, it doesn’t have to be physical. Right? It can be sitting in Redshift.
It can be Parquet. It could be, I don’t care, SQL Server, it doesn’t matter. But if you, in essence, virtualize the persistence and make it accessible from anywhere, OLTP, OLAP, doesn’t matter, I don’t care. Right?
It’s one place to go get it all. I was like, that makes ultimate sense to me, because as a data practitioner, or even a CIO, do I really, really care that much that it’s sitting in SQL Server or Parquet or whatever? I don’t think I do.
Yeah. Right? I just wanna serve these use cases, I wanna support end users, I wanna build dashboards, I wanna be able to run Spark jobs; who cares what it’s sitting as? Anyway, sorry, we’re nerding out here on data infrastructure.
I could talk about that for a long time. Let’s talk a little bit more about AI. Something that we talked about before we jumped on the recording was this paradox, for lack of a better word, that I’m seeing. And the paradox is that Gen AI is becoming ubiquitous.
Everybody is using it. Everybody’s kids are using it to do their homework. Depending on who you talk to, eighty, ninety percent adoption on the desktop. Yet at the same time, we’re hearing all this doom and gloom about ninety five percent of POCs failing, AI bubbles bursting, no ROI, which I doubt, but no ROI, no real productivity gains.
How do you reconcile those two very polar opposites, Karthik? What do you think about this?
That’s a great topic. Let’s dive into that. So I think there’s two dimensions. First, I think we should start with just unpacking like this whole concept around AI, right?
I think a lot of people talk about this like it just got birthed when the age of Gen AI started like three years ago. But the reality is AI has been in the market for quite a while before, in different shapes: machine learning, data science, then Gen AI, and now, more recently, agentic AI, right? At the end of the day, if you take a step back and look at it, at its core, what is it about? It’s about predicting what’s next, right? The machine learning and data science days were about predicting what’s next.
Generative AI age is about creating or generating what’s next.
Agentic AI is about actioning what’s next.
It just so happens that the corpus of data on which the foundational models or the large language models have gotten trained on are much larger than just the estate of any one organization. They bring together the world’s corpus of data, augmented with organization specific data. And then the compute and the scale of intelligence that runs on top of that has just been unmatched in the past relative to where we are today. But at the end of the day, if you verbalize it into common verbs, it is predict what’s next, create what’s next, action what’s next, right?
And AI, like any other technology, is technology. I think what people really have to understand is the common dimensions that apply to any technology transformation, which is your people and your practice: being intentional about the change management that is needed to ensure that you’ve got your people skilled in the technology evolution, and to make sure that the practices and the business processes are fundamentally being re-engineered for this technology paradigm shift. Those are the building blocks and fundamentals to make sure you get the most out of it.
If you just take AI and try to slap it onto some business process or app just for the sake of doing it, it might look really cool in a demo, but it may not really yield sustainable, durable benefits. Right? So I think that intentionality around the actual reimagination, you know, of the fundamental process that you’re trying to impact and improve with AI. The reimagination of the role of the human in that human plus AI loop, to make that process really work with the context you need, with an understanding of your business, as well as to scale your people to be able to focus more on the intellectual activities that they are best positioned to do, while you take out most of the grunt work and offload it to AI to help the people scale. That fundamental reimagination of the people roles in the practice is very, very foundational to making sure that your AI use cases are going to be successful.
If not, I think you can end up with really cool demos that shine really well in a fifteen minutes of fame type setting. But the moment you try to scale out into production, you run into a lot of challenges, right? Because you need people to operate at scale. You need the processes to run cost efficiently at scale, right? You hit states that you may not have predicted when you were building the POC or the demo, states in real world production scenarios that need to be handled at scale, right? So for me, it always starts with the fundamentals, which is the process, and the process could either be solving a problem or creating a new opportunity, but reimagining that to intentionally think about where AI can help either predict what’s next, create what’s next, or action what’s next.
And then, combined with the very essential dimension of bringing the humans and the practitioners into the loop, so that the humans are always there to provide the additional context, to continually evolve the AI and sharpen it for specific functions and use cases. If that magical stitch, I would say, can be accomplished from the outset, the success of the use cases can become far more pronounced. And these are, I think, some of the fundamental patterns that I’ve seen from some really, really amazing customers and folks who figured out how to get the actual value out of AI. They do this intentionally, and that’s the big difference: that intentionality, versus I think some others who struggle more with trying to just get out of the POC phase.
So to perhaps repeat what you just said in a different way.
What I heard you say is that the real value will come from reengineering and re-envisioning the things that we’ve always done. So if you are just trying to make the horse go faster and you’re not thinking about the internal combustion engine, you are gonna run into those problems. If you’re just thinking about automating processes that exist today, perhaps that’s not really gonna drive the transformational value. It all suggests that maybe we need to be a little more creative than we have been in the past.
As the head of a go-to-market function, how are you helping clients reimagine, re-envision, and rethink how they do their business, or rethink how they manage their data, for example? Rethink how they do governance and how they do data quality?
What role does helping your clients reimagine this stuff play in your success?
Well, that’s a great one. I mean, in fact, our customers are increasingly demanding that when we come to the table, we don’t come and talk to them about our products and our features. Right? They want us to start the conversation with, like, hey, knowing what you know about my business and my context,
how could I be applying AI, you know, in a way that’s truly going to differentiate? And differentiate in a durable manner, right? Because the durability is not just about the coolness of the tech, it’s also about the ROI, which needs to be gargantuan and positive for it to make an impact. So I think the best way to talk about this is to just walk through some customer examples, you know, so we can illustrate what true transformation means, right?
So one great example that comes to mind, for instance, is a leading global marketing automation CXM company, right? And what they really do for their customers is help them automate their customer lifecycle marketing. And lifecycle marketing is a very complex undertaking, because this is almost like saying a brand can predict the next best action for every one of their customers and can personalize the most appropriate communication and deliver it at the right point in time in the right channel, right? Traditionally, the way this is executed is marketers having a hypothesis for what a linear journey path looks like, configuring these journey maps, and then attaching content and messages that need to get delivered to customers in specific phases of these journey stages.
And then pretty much orchestrating a sequential, workflow-type journey map that goes from, hey, exploration to trial to purchase to ongoing use.
But as you and I well know as customers of products ourselves, there’s no journey path that’s ever linear. Every customer takes a non-linear journey path, you know: customers who like to self-discover, customers who go ghost for a while, customers who explore your products in sequences that you could not have imagined. At some point, the scale at which a marketer can think through all the journey paths becomes almost unfathomable to try and solve through just manually constructed human workflows, right?
And if you took a process like that, you know, a primitive example of where you could apply AI would be, hey, if I can just apply AI to help create the next best content to attach to an action in a sequential workflow. That’s going to give you some gains, in the sense that, okay, the workflow or journey map of the customer is still sequential, but the AI is helping me bring in the best content for a certain customer demographic and profile and for a certain phase of the lifecycle. But you can take this to the next level and say, you know what? I really can’t predict the volume and scale and the permutations of journey paths for my various customers.
But what I as a marketer know is, if a customer is in state X, here’s the next best action. I do not know how they’re going to get to state X. They might get there through a linear path, through a non-linear path, through various pathways. Now imagine the power of bringing AI in to take the marketer’s human inputs on what the outcomes are and what the best content is to achieve an outcome, and then dynamically ingest signals about the various customer navigation states from telemetry and automatically create that personalized journey navigation for each customer based on those signals.
And based on the marketer’s inputs on the best content to deliver to drive an outcome given the customer’s state, it can then serve up this hyper-personalized, one-on-one communication in various channels, taking the marketer out of the guesswork of needing to figure out all those permutations.
But it still leans on the human expert with the product and domain context to provide the inputs that can help channel the right content delivery at the right time. That is one example where we sat down with this customer and actually fundamentally reimagined the marketing lifecycle process, you know, and brought in AI not just as a slap-on to say, hey, AI can help me generate better content,
but so AI can fundamentally change the game in terms of how I achieve hyper-personalized, one-on-one communications by building on the marketer’s IQ, while also scaling the marketer where the marketer cannot scale as a human. Right?
So that’s one example of such a transformation.
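The pattern Karthik describes, where the marketer authors state-to-action mappings and the system routes each customer from telemetry, could be sketched roughly like this. All state names, signals, and content IDs below are hypothetical, invented purely for illustration, not from any Microsoft product:

```python
# Hedged sketch: the marketer defines the next best action per customer
# state; the system infers the current state from telemetry signals and
# serves the marketer-authored content, regardless of how the customer
# arrived at that state (linear or non-linear path).

# Marketer's inputs: for each observed state, the next best action/content.
NEXT_BEST_ACTION = {
    "exploring":     "send_feature_highlights",
    "trial_active":  "send_onboarding_tips",
    "trial_stalled": "offer_discount",
    "purchased":     "send_advanced_use_guide",
}

def infer_state(signals: dict) -> str:
    """Map raw telemetry to a marketer-defined state (toy heuristic)."""
    if signals.get("purchased"):
        return "purchased"
    if signals.get("trial_started"):
        return "trial_stalled" if signals.get("days_inactive", 0) > 7 else "trial_active"
    return "exploring"

def next_action(signals: dict) -> str:
    # The journey is non-linear: we never ask *how* the customer reached
    # this state, only what the marketer says to do once they are in it.
    return NEXT_BEST_ACTION[infer_state(signals)]

print(next_action({"trial_started": True, "days_inactive": 12}))
# -> offer_discount
```

The design choice worth noting is exactly the one in the conversation: the human supplies the outcome-to-content knowledge, and the automated part only handles the unfathomable combinatorics of paths into each state.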
Yeah. With real time feedback loops.
Yeah. Right? Feedback loops that are coming out of your analytics stack, going back to the AI and saying, this is working and this is not working. That, to me, is not only the process reengineering, but also the feedback loops.
Like, real time is some of the eureka stuff here, because, you know, as a product person, you’d have to wait. You’d launch something, you’d go to market, you’d implement your go-to-market program, and you’d have to wait for the feedback to come from the market. That’s not how it is anymore. Now you have this automation that you just talked about, which is, okay, if this didn’t work, then try this the next time around, or try a different offer. Instead of fifteen percent off, go to thirty, whatever it is.
So all of those things.
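That automated "try a different offer next time" loop is, at its simplest, a bandit problem. Below is a minimal sketch, with offers and conversion rates invented purely for illustration:

```python
import random

# Hedged sketch of the automated feedback loop described above: an
# epsilon-greedy bandit that shifts traffic toward whichever offer
# (e.g. 15% vs 30% off) is actually converting, instead of waiting
# for a delayed go-to-market readout.

OFFERS = ["15_percent_off", "30_percent_off"]
stats = {o: {"shown": 0, "converted": 0} for o in OFFERS}

def choose_offer(epsilon: float = 0.1) -> str:
    # Explore occasionally; otherwise exploit the best observed rate.
    if random.random() < epsilon or all(s["shown"] == 0 for s in stats.values()):
        return random.choice(OFFERS)
    return max(OFFERS, key=lambda o: stats[o]["converted"] / max(stats[o]["shown"], 1))

def record_outcome(offer: str, converted: bool) -> None:
    stats[offer]["shown"] += 1
    stats[offer]["converted"] += int(converted)

# Simulated market feedback (hypothetical true conversion rates).
TRUE_RATE = {"15_percent_off": 0.05, "30_percent_off": 0.12}
random.seed(7)
for _ in range(2000):
    offer = choose_offer()
    record_outcome(offer, random.random() < TRUE_RATE[offer])

best = max(OFFERS, key=lambda o: stats[o]["converted"] / max(stats[o]["shown"], 1))
print(best)  # the loop converges on the higher-converting offer
```

A production system would be far more sophisticated, but the shape is the same: observe, score, and reroute continuously rather than in quarterly review cycles.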
But in that world, you know, the people’s roles do change a little bit. So what is your perspective? I mean, you touched on this briefly before. What’s your perspective on the retooling, reeducation, and pivoting of some of our traditional roles to be more of a guide, more of a business consultant, more of a business optimizer than just a data manager? How would you suggest somebody make that shift in their career?
A hundred percent. Now, roles are going to evolve, and there are going to be two types of evolution. Right? One evolution is an existing function evolving to become an AI-enabled or AI-scaled function.
And there are also going to be some functions where the need for the humans in the role might genuinely diminish, because the task itself is something that can be highly scaled and automated through AI. I think we as professionals have a great opportunity to look inward, to look at our roles, to understand the things that we do uniquely, which truly, truly bring out the IQ and the creativity in us, relative to the things that are more, I would say, mechanical and can genuinely be automated and scaled with the power of AI, right? I truly think that each of us has the power in our hands to redefine what our role should be, right?
The thing that I commonly run into when mentoring folks is a lot of people come to me and talk about, oh my God, I’m not sure where my job is going to go, where my role is going to go, you know, and they’re almost sitting and waiting for some force above, whether a senior leader or the organization, to come and set the pace for where they should be going. And the simplest advice I can give is, rather than waiting for that to happen, take control of your own path, right? In fact, the leaders in most organizations, and this is certainly the case at Microsoft, you know, are looking for people to actually step up and say, hey, how should we be thinking about reimagining the next set of roles? Right?
And one thing from a leadership perspective: the more leaders can, I would say, engage their folks in that dialogue, so that they can collaboratively define together what that evolution needs to look like, the greater the buy-in you’re going to get from people and the less psychological unsafety you will have to navigate as a leader. From a people perspective, there’s got to be more ownership and accountability to say, hey, the leader in the organization doesn’t have all the answers. How can we step up and show where it can be completely beneficial to the business, truly doing what we do uniquely best and then scaling with AI in areas where we can and should?
So to me it’s a two way street, right? And when you go through that exercise a couple of things are going to happen. Existing roles will evolve.
Some roles may cease to exist. But the roles that go away don’t come as a surprise, because they come as an intentional evolution, with both the leadership and the people working together to navigate it. That way, you’re even giving people the opportunity to prepare and blend into the evolution, as opposed to feeling like the rug was pulled from under their feet, right? So it’s a dual-sided accountability.
It’s not a one-way street, right? Leadership has a role to play, people have a role to play, but with the psychological safety to do it together, and with the vulnerability to accept that no one knows all the answers, while also challenging and pushing the status quo to say, you know what? The change is here. If I’ve chosen to be in this industry, I need to evolve.
Change is inevitable. Now let’s partner and figure out how we make this happen for the best of us and for the best of the company.
Yeah. Put another way: embrace a growth mindset. I mean, yeah. Have you ever read Satya’s book? Oh, yes.
Yes. Yes. I mean, Hit Refresh. Right, Satya’s Hit Refresh. He’s all about a growth mindset.
He mentions it multiple times in his book, and what you just said is: disrupt yourself before you are disrupted.
Hundred percent.
Hundred percent. But as leaders, we need to create safe spaces. Right? We need to allow people the creativity and the risk-taking to do that. We need to give them the space to do that.
But at the same time as the individual, what you’re saying is, you know, you’ve gotta do it. Right? You’ve got to find ways to disrupt yourself. And when you do that, and when you take control, I mean, that’s really ultimately going to provide you the most freedom in the long term.
Beautifully said.
Yeah. Yeah. Couldn’t agree more.
I have obviously embraced, or I’m trying to embrace, a growth mindset. My book is entirely about that. But anyway, let’s put our magic forward-looking goggles on, because by the time people watch this, it’s probably January, and this is the time of year when we start to look forward and say, here are all the things coming. Now, again, to the degree that you’re comfortable, because you’re a senior leader at Microsoft talking about future-looking stuff, I’m giving you a bit of a safe harbor here.
What do you think is coming around the corner for the big things in twenty twenty six? What should we as data leaders be thinking about and trying to get ahead of the curve on?
Great, great. Let’s talk about this. And I would like to pick your brain on this as well as we go through. One thing that’s top of mind for me is we talk a lot about data management for AI, right? Yep. I would love to see more happen on the flip side, which is AI for data and data management. If you look at some of the hardest data management challenges, whether it’s master data management, data quality management, or data observability, there is as much of an opportunity to apply AI to scale the fundamentals of those data management workloads.
And doing that in a responsible way, where you can also have that human-in-the-loop experience, where the domain and subject expertise of data stewards and other data management professionals can get infused to achieve what we just talked about, right? Which is this magical human-plus-AI, one-plus-one-greater-than-three equation in the world of data management. It’s something that I would personally love to see happen, right? And we’re starting to take some steps toward that at Microsoft, as well as together with our partners, and we’re hoping to see more of that happen.
There’s another big one, which we talked about earlier, which is context, right? Context, and how do you model context around your data estate? And I recently had a phenomenal opportunity to speak with one of our customers, and this is actually an SMB customer, a small and medium business customer, right? But they’re like, hey, look, we don’t have a large data team. You know, we don’t have trained data professionals and data management experts. Our dream state would be where we could apply something like AI, you know, to go and look at our data estate, and where we could feed it some documents about our business that carry context on our business.
And whether AI can just go in there and model out my ontologies, model out my knowledge graphs, map them on top of my data estate, and give me the full lens of view, you know, where my users can now just go get productive. Of course, my human users will then continue to refine and optimize and feed more signals and everything else. But I don’t have the energy and the time to go bootstrap all of that from scratch. It’s like, how are you guys thinking about putting AI to task on that?
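The dream state the customer describes, bootstrapping an ontology and mapping it onto the data estate, might look something like this in miniature. The entities, relations, and table names are all hypothetical; in a real system, an LLM would extract the facts from the business documents rather than a human hand-coding them:

```python
# Hedged sketch: build a tiny knowledge graph from business facts (as an
# LLM might extract them from documents) and map each entity onto the
# physical tables in the data estate. Everything here is invented for
# illustration.

from collections import defaultdict

graph = defaultdict(list)   # entity -> [(relation, entity), ...]
table_map = {}              # entity -> physical tables that carry it

def add_fact(subject: str, relation: str, obj: str) -> None:
    graph[subject].append((relation, obj))

def map_to_estate(entity: str, tables: list) -> None:
    table_map[entity] = tables

# "Documents about our business" distilled into facts (hypothetical).
add_fact("Customer", "places", "Order")
add_fact("Order", "contains", "Product")
add_fact("Product", "belongs_to", "Category")

# Mapping the ontology onto the (hypothetical) data estate.
map_to_estate("Customer", ["crm.contacts", "billing.accounts"])
map_to_estate("Order", ["sales.orders"])

def tables_for_question(entity: str) -> list:
    """Follow the graph one hop and collect every mapped table --
    the 'full lens of view' a user needs to answer a question."""
    tables = list(table_map.get(entity, []))
    for _, neighbor in graph[entity]:
        tables += table_map.get(neighbor, [])
    return tables

print(tables_for_question("Customer"))
# -> ['crm.contacts', 'billing.accounts', 'sales.orders']
```

The human-in-the-loop part is the refinement the customer mentions: users correct and extend the generated facts and mappings over time, rather than authoring them from scratch.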
Right? So this whole notion of context modeling, context mapping, and scaling that together with AI, I think, is a phenomenal opportunity for us to also think about and go after. Right? And then the third big one for me is that none of this is actually going to work if you can’t operationally scale.
Right? I mean, you’ve got to be able to make sure, and think about it this way. I think someone put it really well in one of the LinkedIn posts: in the world of AI, not done well, darkness could flow at the speed of light.
And that’s actually very true, right? For those of us who’ve been in operation centers, in the middle of live-site operations and on-call duties, it’s hard enough to keep pace with live-site incidents when it’s just humans engineering and running the systems. Now you’ve got agents doing that at a scale that is really hard to match. You know, if things start going south, the flip switches and the kill switches, as well as the mitigation processes, could become really, really challenging.
And where I’m going with this is the world of observability, right? Observability, making sure the guardrails are on, making sure that even though you want to bring in agents to help scale the workloads, you’ve also got agents who can counterbalance and make sure that when things start going wrong, the right actions are being taken and humans are being brought into the loop in a timely manner. So this whole notion of how AI can be applied to the world of observability, to scale operations, to keep AI safe, I think is another great, I would say, virtuous-cycle moment that we have to think hard about. Right? So I’m personally excited about where we can go with flipping the script from data for AI to AI for data, and seeing what we can accomplish there.
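The counterbalancing idea, an observer that trips a kill switch and brings a human into the loop when an agent starts going wrong, could be sketched as follows. The thresholds and action names are invented for illustration:

```python
# Hedged sketch of the guardrail pattern described here: watch agent
# actions, trip a kill switch when too many recent actions fail, and
# escalate to a human in a timely manner.

class AgentGuardrail:
    def __init__(self, max_errors: int = 3, window: int = 10):
        self.max_errors = max_errors
        self.window = window       # evaluate only the last N actions
        self.recent = []           # True = success, False = failure
        self.killed = False
        self.escalations = []

    def observe(self, action: str, success: bool) -> None:
        if self.killed:
            return                 # agent already stopped
        self.recent = (self.recent + [success])[-self.window:]
        if self.recent.count(False) >= self.max_errors:
            self.killed = True     # flip the kill switch...
            self.escalations.append(   # ...and bring a human into the loop
                f"paged on-call after repeated failures (last: {action})")

guard = AgentGuardrail()
for ok in [True, True, False, True, False, False]:
    guard.observe("update_customer_record", ok)

print(guard.killed)  # -> True: third failure in the window trips the switch
```

The counterbalance could itself be another agent watching richer signals, but the contract is the same: bounded error budget, automatic stop, mandatory human escalation.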
Yeah, I completely agree across all three.
And, they’re all gonna be focus areas for me in twenty twenty six for sure.
I think our biggest challenge is really rethinking our processes. Right? What I’ve been saying for the last two or three years is that the foundations of BI are not the foundations we need for AI.
Right? We need to rethink how we’ve always done data quality. We need to rethink how we’ve done data governance. And at the core of it, it is that context centricity.
Right? And when you embrace context centricity, when you embrace the idea that not all data is created equally, you warm more naturally to the idea of allowing AI to play a stronger role in allowing you to scale. Right? Because where we need to get to is, instead of one set of governance policies to rule them all, which we’ve always known doesn’t really work.
But the idea of having n sets of governance policies, right, n sets of definitions, n sets of quality standards: that’s a lot of work. However, that’s where AI can help. Exactly. Right?
If we embrace the idea that data quality is data that is fit for purpose, well, what can AI do really well? It can do pattern recognition really well. It will know when the data is fit for purpose and when it is not. That’s just one way of thinking about data quality.
Instead of this deterministic, rules-driven, here-are-the-rules approach, look at the transactional data. Figure out when processes are working and when they’re not, and feed that stuff into LLMs so they can find anomalies and patterns. Build out causal AI models where you know what is driving a marketing conversion and what’s not, and when data plays a role and when it doesn’t. Those are the kinds of things that are, I mean, huge opportunities for us in twenty twenty six. But at the core of it, it kinda gets back to that idea of being willing to take some risks, thinking about things differently.
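One way to read "fit for purpose" data quality as pattern recognition rather than hand-written rules: learn what a trusted column normally looks like, then flag batches that drift from that pattern. A minimal sketch with invented numbers:

```python
import statistics

# Hedged sketch: instead of a deterministic rule ("value must be < 200"),
# learn the normal pattern from past, trusted batches and flag anything
# that deviates too far. Data and thresholds are invented for illustration.

def learn_profile(history):
    """Learn the 'normal' pattern (mean, spread) from trusted batches."""
    return statistics.mean(history), statistics.stdev(history)

def fit_for_purpose(batch, profile, z_limit=3.0):
    """A batch is fit for purpose if its mean sits within the learned band."""
    mean, stdev = profile
    z = abs(statistics.mean(batch) - mean) / stdev
    return z <= z_limit

# Daily order totals we trust (hypothetical history).
history = [100.0, 102.0, 98.0, 101.0, 99.0, 103.0, 97.0]
profile = learn_profile(history)

print(fit_for_purpose([100.5, 99.0, 101.0], profile))   # -> True
print(fit_for_purpose([250.0, 240.0, 260.0], profile))  # -> False
```

The point of the toy example is the shift in posture: nobody wrote the rule "250 is bad"; the system inferred that the batch doesn’t match the pattern of data that was previously fit for purpose.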
And, you know, that’s tough for a lot of us who are kind of born and bred in this deterministic, rules-driven world. So it’s gonna come down to people, I think, like you, Karthik, leaders in the market who are able to say loudly: rethink things. You know? Don’t be afraid to take some risks.
Go break some stuff. We don’t like to break things because, you know, we get yelled at when things break.
Yep.
We’re gonna have to open ourselves, I think, to some of those possibilities.
Hundred percent. Super awesome. Yeah.
Yeah. Well, with that, Karthik, I will let you go. It’s Friday afternoon here on the East Coast, Friday midday on the West Coast where you are. Thank you so much, my friend. This was a wonderful conversation. I could talk to you for... Absolutely.
Likewise, Malcolm. Appreciate the opportunity. Yeah. Thanks a lot. Alright.
Thank you. And to our listeners, thank you for listening. Thank you for listening in twenty twenty five. I look forward to having a great year with you in twenty twenty six. Take a moment to subscribe, take a moment to like, all of that stuff.
With that, I will see you on another episode of CDO Matters sometime very soon. Thanks everybody. Happy New Year.