Understanding AI's Role in FinOps

FinOps Weekly Podcast
Episode 13 | September 20, 2025 | 00:36:08

Hosted By

Damian Munafo Victor Garcia

Show Notes

Learn how AI tools can enhance cost allocation, automate repetitive tasks, and create actionable insights, leading to more efficient and effective financial management.

We explore how AI is revolutionizing financial operations (FinOps) with guest Ben Schechter, CEO of Vantage. Ben explains the intricacies of cloud cost management, the importance of Model Context Protocol (MCP), and how leveraging AI can optimize FinOps practices.

Chapters

  • (00:00:00) - Introduction
  • (00:03:23) - AI's Impact on FinOps
  • (00:05:26) - Leveraging AI in FinOps Practice
  • (00:12:00) - Understanding Model Context Protocol (MCP)
  • (00:15:06) - Core Use Cases and Dataset Allocation
  • (00:17:46) - Building APIs and Developing MCP
  • (00:19:28) - Small Companies vs. Enterprises
  • (00:22:28) - Tagging Hygiene and Cost Allocation
  • (00:28:53) - Future of FinOps and AI Integration
  • (00:35:15) - Concluding Thoughts and Farewell

Episode Transcript

[00:00:00] Speaker A: Hi everyone and welcome to a new episode of the FinOps Weekly podcast. Today we are going to dive into the AI world: how it's affecting the FinOps practice, all these buzzwords that have been related to AI, how you can control the cost of your AI in the cloud, how you can improve the management of AI workloads, and how you can leverage AI for your FinOps practice. And for doing that we have the help of Ben Schechter, the CEO of Vantage. How are you, Ben? [00:00:35] Speaker B: I'm doing well. It's good to be chatting again. Thank you for having me. [00:00:40] Speaker A: It's a pleasure to chat again. It was great to meet you in the session that we had this summer. I think people got a lot from it related to MCP. If you haven't, go take a look; I think the link will be in the description. And for helping me also dive into the topic, we have my co-host, Damian. How are you doing? [00:01:01] Speaker C: Hi, doing great, very good to be here and to finally meet Ben. Hi Ben, how are you? [00:01:08] Speaker B: I'm doing well, yeah. Good to connect, good to meet. Looking forward to the conversation. [00:01:13] Speaker A: Awesome. So yeah, let's start things off with very basic questions so people can figure out who you are and what your company is doing. So tell us a bit more about Vantage. What do you guys do? What's the mission of the company? Since you are one of the founding members, you probably have this message super clear. So please tell us a bit more about you and the company. [00:01:38] Speaker B: Yeah, for sure. So thank you everyone for listening in. My name is Ben Schechter, I'm co-founder and CEO of a company named Vantage. The website is vantage.sh.
You can go and take a look there. In super simplistic terms, we are a cloud cost management platform, or a FinOps platform, predominantly used by modern engineering teams to manage not only primary cloud costs like AWS, Azure and GCP; we have support for about 20 other providers, and our hope is to be as full a representation as possible of anything that contributes to usage or consumption based pricing. And this is driven by other providers like Datadog, Databricks, OpenAI, which will probably be a bit of a topic that we'll talk about today. And yeah, that's a pretty good high level overview. We have over 15,000 organizations using Vantage globally, used by everyone from individuals all the way up to Fortune 500 enterprises. So I'm sure we'll dig a little bit deeper, but that's the company. And then just background on myself: prior to Vantage, I worked at Amazon Web Services and DigitalOcean as a technical product manager on container services and compute. So the joke that I make is that I was responsible for driving costs for a number of years, and now I'm trying to redeem myself on the Vantage side by helping customers actually manage those costs. So that's a brief background on the company and myself. [00:03:09] Speaker A: Good deal, good deal. Yeah, you are now on the good side of things, right? So we'll talk more about this and the company, but just to start kicking things off: AI is the hot topic of the year, and I think it will probably be the hot topic of the decade, especially in the technology space. So if you can, help the audience demystify the AI topic a bit, especially for the FinOps audience. What do you think are the key topics to handle? What are the things that people should take into account when referring to AI, especially focused on the cost side?
[00:03:50] Speaker B: Yeah, so I'll give you the two categories of what might be relevant for FinOps practitioners or people who are in the cloud cost management space as it relates to AI. There are two different facets of it. The first is: how can we make sure that we have coverage for all these new providers that the organization is using that are driving costs from an AI perspective? In the past, organizations were using AWS, Azure, GCP, and increasingly a large portion of these costs are being driven by just net new providers with their own bespoke token based pricing structures. The obvious ones that you'll hear about are ChatGPT from OpenAI and Claude by Anthropic, but there's a long list of other AI specific providers that, as time goes on, are just becoming a higher concentration of overall cloud spend. And so for your FinOps practice you'll want to make sure that you're aware of those costs, that you have proper visibility on those, and that you have the tools you need for allocation and remediation, everything along those lines. So that's one facet. The second facet is actually one of the more interesting ones, which is: how can you leverage AI in your FinOps practice? That's where, unrelated to all the tracking of costs, how can you enable your organization, knowing that everyone is using these LLMs, whether ChatGPT or Claude or Cursor, to actually accelerate your FinOps practice by leveraging AI? And we'll probably get into this a little bit in this conversation. But increasingly, vendors in the market and organizations are exposing this data specifically to give LLMs context about FinOps cost data, so that you can move a little bit quicker with a number of things that you might be doing manually today. And I'm sure we'll get into some of those examples.
But long story short: how can you leverage LLMs in your organization to enable more people to access that information and complete FinOps workflows from an LLM perspective? [00:06:01] Speaker A: That's a key topic. I think it's always like a duality with AI: FinOps for AI and AI for FinOps, how you can control the cost of AI itself, or how it can help you manage your FinOps practice. And regarding this topic of AI being able to help you in the FinOps practice: where do you think it has the most impact, or what are the areas where AI can help a FinOps practitioner the most, or someone that is controlling cost, especially in the cloud? [00:06:42] Speaker B: Yeah, I'll tell you. Just as background, I'll give you some perspectives that we have from our customer base, what we're seeing, and what we've done to even get some of that information. So just as background, I think in the last probably 2 to 3 ish months, if you're listening to this podcast as recently as it's published, but I'll say in 2025, we launched both a local and a hosted MCP server. If you aren't familiar with that, we can get into it here in a bit. But basically, in super simplistic terms, it exposes cost data, or data about any of the providers that you use, to these LLMs like ChatGPT or Claude. And the folks that we largely interact with are the people who are responsible for the cloud bill: it's the FinOps practitioners, it's people in engineering. But one of the core goals for the organization is, as we all know, FinOps is kind of like an organizational change or a cultural change. And part of that is just getting people to engage with the cost data, engage with these platforms. Before, organizations were going into a business intelligence tool, and for Vantage it might be logging into the console. And over time people want to access that information programmatically.
Now the new facet that people expect is to be able to interact with that cost data from an LLM perspective. The nice thing is that you can drive a lot more adoption in the organization, because people who might not be sophisticated enough to log into a FinOps tool or educate themselves on a new system can just interact with their cost data and ask a model questions and get their answers directly, in a conversational way. And so what we're finding is that it's actually driving more adoption of FinOps in the organization, because people are a lot more comfortable asking questions to a computer than they are asking their FinOps counterpart and having that person go and do the work. And these workflows can just be automated. So there are a number of basic questions that can be answered through an LLM: what's our savings plan coverage? How are my costs trending on a unit basis for this service or this project or this team? How am I trending towards budget? All of those things can be done via an LLM now. And then some of the other interesting things that are opening up are organizations being able to combine that cost data, through the LLM, with other tools that they might use. So for example, if I'm using Google Drive or Notion, I can say: hey ChatGPT, can you create a report that I would normally do weekly by hand, take the information from these four or five reports that I have in Vantage, and programmatically create them in Google Drive or Notion to share with the broader organization? So it's accelerating some of the manual workflows now as well. [00:09:47] Speaker A: Yeah, I think the MCP part, and of course you can add to that, Damian, is one of the most impactful things, because of the way that it interacts with the different topics. Being able to do it in natural language is critical, right, Damian? [00:10:05] Speaker C: It is, definitely.
AI came up and accelerated at a pace that I don't think anyone expected would be so fast. And everybody seems to be trying to adopt it, and we see that a lot of information and learning is missing. It's good that we have these kinds of things, that we can share information here on the podcast and in other places too. A little bit like Vantage, where you guys are always putting information out there for everyone to see, not only on AI but on different things, and sharing it with the community, which is very good. And yes, I think that this is a very good direction to be taking, and I would love to see more initiatives like this. There are always new services and new things, and the moment that we use and integrate these kinds of models into the day to day work, I'm hoping to see those models identify new things, right? New areas where we can optimize, and look for those, and help us in the day to day, and also keep us very focused and doing more of the cultural work, because the day to day is very painful for us: the communication between all the teams, and there is a lot of work that is being thrown at the FinOps team. The more that we integrate these models, the more it's going to help us be efficient. [00:11:57] Speaker A: Yeah, for sure. And diving more into the MCP and AI topic: for a FinOps platform like yours, what are the key elements for being able to use AI and MCPs to create the type of work that you mentioned, summarizing, creating reports, being able to interact? How do you allow the AI to interact with that data in your case? [00:12:31] Speaker B: Yeah, good question.
Before we dive into things, I think it might be good for listeners who are not familiar with MCP for me to give a high level overview of what that is, how you can leverage it, and what it ultimately means. This whole space is developing so quickly, but I think in another 3 to 6 months this will probably be so obvious that it's an expectation that you have MCP support. So I'll try to give a high level primer on what MCP is, and then also how any SaaS provider like Vantage can expose data through MCP. It's not something that's specific to the FinOps world, but it's something that the FinOps world is quickly adopting. So, as a super brief overview: MCP, if you aren't familiar, stands for Model Context Protocol. It's essentially the interface, or the protocol, through which you can provide data or context to a large language model. An example of a large language model might be ChatGPT, which I just mentioned because that seems to be the most popular one, but it can also include things like Claude by Anthropic or your own custom models. Essentially what needs to happen is that MCP needs to be supported by a large language model. And if you go and Google MCP, this is a protocol that was actually developed at Anthropic, and it's just recently been announced that it's being adopted by OpenAI, and there's a beta support for MCP in ChatGPT. So what does this actually mean? Let's say you're at an organization, your organization is using ChatGPT, and you want your employees to be able to ask questions about their cloud cost data. You can go to your FinOps vendor of choice, and obviously I'm very biased because I co-founded Vantage, and say: hey, we'd like to have an integration into ChatGPT to allow our employees to ask questions about their cost data. The way that actually works these days is very simple.
If you go to ChatGPT, there's a little beta interface where you can say: I want to integrate with Vantage. It just goes through OAuth, and suddenly your large language model has context. As you ask questions about your AWS bills, your cloud cost bills, the large language model will know: I need to go and query Vantage through this MCP protocol and gather the data to be able to answer questions. So it's kind of magical. If you've seen the movie The Matrix, you know when Neo is trying to learn something new and they say, hey, I want to know kung fu, or I want to learn how to fly a helicopter? MCP is kind of like that: I want my large language model to just be aware of my cost data; I plug it in and suddenly it has access to all of that information. As for the foundation, if you want your vendor to support MCP, there are two things they need as it relates to FinOps. They need all of the data structured in a really normalized way, and all of the things from a FinOps perspective still apply: you want to have really good organization of your cost data, and you want to do virtual tagging to make sure it's allocated. Then once all of that work is done, and it's an ongoing, consistent thing that all FinOps practitioners are doing, you'll want to make sure that you have APIs to expose that data. And then the MCP is largely wrapping those APIs. It's kind of like a set of prompts that trains the large language model on which APIs to call for which kinds of workflows. When we talked before, there's an image that I typically share that's like a Maslow's hierarchy of AI needs for MCP: the base layer is data, the second layer is APIs, then you have MCP, and then everything else is available from there. So that's a super high level overview of MCP and how it can relate to a FinOps vendor.
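The layering Ben describes (data, then APIs, then MCP wrapping those APIs) can be sketched as a toy tool-dispatch loop. To be clear, this is not the real MCP SDK or the Vantage API; the tool name, dataset, and cost figures below are invented for illustration, and a real MCP server would use an official SDK and speak JSON-RPC over stdio or HTTP.

```python
# Toy sketch of the MCP layering: an "MCP server" is essentially a set
# of named tools that wrap existing APIs, plus descriptions that teach
# the LLM which tool answers which kind of question.
# All names and numbers here are hypothetical, not the Vantage API.

# Layer 1: a normalized cost dataset (stand-in for provider billing data)
COSTS = {
    ("team-payments", "aws"): 1200.0,
    ("team-payments", "openai"): 300.0,
    ("team-search", "aws"): 800.0,
}

# Layer 2: a plain API over that dataset
def get_costs_by_team(team: str) -> dict:
    """Return provider -> cost (USD) for one team tag."""
    return {prov: c for (t, prov), c in COSTS.items() if t == team}

# Layer 3: the MCP-style tool registry the model is shown; the
# description is what lets the LLM pick the right tool for a question.
TOOLS = {
    "get_costs_by_team": {
        "description": "Return provider costs in USD for one team tag.",
        "fn": get_costs_by_team,
    },
}

def handle_tool_call(name: str, args: dict):
    """What the server does when the LLM decides to call a tool."""
    return TOOLS[name]["fn"](**args)

print(handle_tool_call("get_costs_by_team", {"team": "team-payments"}))
```

The point of the sketch is the ordering: without the normalized dataset and the API underneath it, there is nothing useful for the tool layer to expose, which is why the hierarchy Ben mentions puts data at the base.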
And Victor, I may have gone off course there a little bit, but I think that's good context, and I can relate it back to the FinOps topic as well, for sure. [00:16:46] Speaker A: And of course, if someone wants a deeper dive on MCPs, as we mentioned before, we have the session, the masterclass that Ben did in the past months on the topic. So if you have any doubts, you can check it out, because it was very complete and very specific to MCPs. But coming back to the AI interaction: you mentioned what a tool needs to fulfill to be able to extend those capabilities. So do you think that it's possible to handle all of this MCP work, all this data to ingest, today by building an in house tool, building a SaaS tool internally in your company, so that you can interact with these MCPs? Do you think that a third party is required to be able to interact properly with the cost data, or what's your perspective? Maybe you can give an example for a small company and for a large, super huge company that is more complex. [00:17:56] Speaker B: Yeah, totally. And if you look at my background, I'm not a salesperson, and I always try to talk about things from the customer problem. I can give you the pros and cons of: can you do this in house, can you develop this yourself, and then should you use an outside vendor, or something along those lines? I would say, as with everything, MCP included, you can develop this in house or roll something locally yourself and expose all of this to a large language model if your heart so desires. And essentially, going back to what we were talking about, you would still need to complete those core use cases.
You would still need a data set, you would need to allocate that data set, you would need to build those APIs, and then ultimately you would need to dig into developing your own MCP. Nothing prevents an organization from doing that. I am sure that there are large enterprises that may decide to go that route, if they have a team that can be staffed on it and is knowledgeable about it. The only downside there is, as with everything, you have to put a level of investment in: hire the people who know how to do that, make sure that it's maintained, and then on an ongoing basis make sure the functionality is kept up to date. The benefit of using an outside vendor is that we have expertise on a lot of that stuff, and we do have a switch you can flip, and all of that is available to you right off the bat. In terms of your question about the stage of the company: I think what we'll see is that smaller companies will probably do some hobbyist work on this, like, I want to dig into this for my own career, I want to learn about MCP, I want to be exposed to this. And I think actually, on the enterprise side, we'll see a lot more enterprises looking to go and buy a solution to get something in place relatively quickly, just because there are so many initiatives right now about how can I enable my organization to be AI ready, or accelerate some of the work they're already doing, and the money spent on a vendor there is de minimis relative to how quickly you can have your organization move. And also, given the scale of data and what a large enterprise needs to have exposed from an API perspective, in general I think it's good to have that outsourced. That being said, I'm sure there will be some mix and match on both: there will be small companies that want to use a hosted solution, and there will be enterprises that want to develop their own thing in house.
And I think it just depends on the nature of the company and how quickly they want to adopt an AI workflow. [00:20:35] Speaker A: Yeah, I think speed is always the topic, right, Damian? [00:20:40] Speaker C: Yeah, speed is definitely one of the topics. And if we look at where the industry, or some of the foundations, or the FinOps practitioners are taking this, the focus and direction, most of the things we are looking at are becoming a kind of standard design, so I would expect this part to follow as well. Again, like we're saying, we want to be fast, it has to be efficient. I like the direction where you have a tool, you flip that switch, and you have this MCP and everything. At least in my view, I would focus on that and use my resources for other things. Of course, there is always innovation, trying things out. But I believe that the moment this becomes standard, it will be easiest to take something off the shelf, yeah, for sure. [00:21:46] Speaker A: And that's a very good insight. Okay, so we have now decided on the capabilities, and we have figured out whether to buy the solution or build it in house. So apart from what you mentioned before, for example creating the report about your cost data, what do you think are the tasks where it will really help to have a tool, either built in house or third party, do that in FinOps? For example cost allocation, multi cloud unification, of course interaction with SaaS. What are the examples where maybe your customers have used the tool? [00:22:35] Speaker B: Yeah, so there are a couple of really interesting use cases I can give, specific to the FinOps world, that I think might open people's minds to the things that you can have done.
So I'll give one really interesting example right off the bat. This is usually something that we hear across the board from our customer base: we have really poor tagging hygiene. There's non conformity between uppercase and lowercase, every single team is spinning up new tags left and right, and we want to merge all these costs together into a virtual tag, or into our own showback or chargeback cost allocation rules. That entire process is super, super time consuming. You have to go and get the underlying cost and usage report, you have to audit all of the cost allocation tags the organization is using, and you have to put in place some filtering logic to merge those into virtual tags. With AI, what a number of our practitioners are doing, and I actually heard this from a few different customers, is saying: I'm using the Vantage MCP just to go and automatically profile all of the cost allocation tag keys and values I have from the underlying providers, and do all of the work that I would normally have to do manually for grouping all of those tags together by a business unit, or by a function, or by a service internally. This is something that would take days to weeks for a FinOps practitioner to do, plus a corresponding campaign internally to make sure that you have full coverage. And now AI is doing it nearly instantaneously. Two of the APIs that we make available to our MCP are around tagging, specifically for this use case. So you can get information that's not necessarily just raw cost data, but other context about what's in your cloud environment, and then automate the creation of all these virtual tags, automate the creation of all your cost allocation rules. I would say that's an example where you're getting massive, massive acceleration from a FinOps perspective.
And if people are familiar with that crawl, walk, run maturity model, there are certain levels for what you want for cost allocation coverage, and AI can help you move up that maturity model very, very quickly. So that's one really interesting idea. There are a couple of other things that we hear from customers, for ongoing tasks that they would have to do on a weekly or monthly basis. We touched on this a little bit before, but it's combining MCPs to do some of their work. The combinations of MCPs or tools that I'm hearing about are Vantage with Notion, Vantage with Google Drive, and then Vantage with Linear, which is kind of like Jira, if you're not familiar. For that last one it'll be something like: hey, we have an initiative to decommission this region in AWS, to do a migration from this region to the other region. Can you go and create a series of Linear tasks or Jira tasks, with the corresponding cost impact, for my team to actually action on? And again, this is something where normally you would have to go and get a whole bunch of data, and you would have to identify these things yourself. Now you can literally throw that prompt at an LLM. It will go and pull the corresponding data from Vantage, it will give you something to review, and then it can automate the creation of these reports in Notion, Linear or Google Drive. You're orchestrating the work you would normally have to do manually, in seconds. It does kind of feel magical. And I do think this is where the world is going: it's just going to become a table stakes expectation that FinOps practitioners have a little AI copilot or helper for a lot of their tasks going forward. So those are two of the more interesting things that I've heard about from our customer base.
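The tag-hygiene workflow Ben describes, profiling non-conforming tag keys and values and merging them into virtual tags, boils down to normalization plus grouping. Here is a minimal sketch with made-up cost lines; in the workflow from the episode the LLM does this by calling the vendor's tagging APIs rather than running local code, so treat this purely as an illustration of the transformation.

```python
from collections import defaultdict

# Hypothetical raw cost lines with the inconsistent casing Ben
# describes: three spellings that should be two virtual tags.
lines = [
    {"tag": "Team:Payments", "cost": 100.0},
    {"tag": "team:payments", "cost": 50.0},
    {"tag": "TEAM:Search",   "cost": 75.0},
]

def normalize(tag: str) -> str:
    """Collapse case so 'Team:Payments' and 'team:payments' merge."""
    key, _, value = tag.partition(":")
    return f"{key.lower()}:{value.lower()}"

# Merge into "virtual tags": one cost bucket per normalized tag.
virtual = defaultdict(float)
for line in lines:
    virtual[normalize(line["tag"])] += line["cost"]

print(dict(virtual))
```

Real tagging hygiene also has to handle synonyms and missing tags, which is exactly the fuzzy matching work an LLM can profile far faster than a manual audit of the cost and usage report.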
[00:26:39] Speaker A: Those are two really good use cases. Definitely, being able to create reports in a way that's easy to understand, and gathering all the insights quickly, is probably one of them, because the problem that we normally see, and probably your customers do too, is that we have the data, the raw data, and you need to translate it into insights for your leadership, for your management, so that you can actually take action on something. Your CTO cannot dig into the CUR file and see what's going on; you need to get the insight for her or for him, and then be able to say: okay, this is the action item, this is what's going on based on this data, right? And the AI can quickly get insights, even though numbers are not the strongest point of these models. But the summarization, being able to digest that and get to the right point, is one of the most powerful things. And also being able to get help creating the tickets and all this manual work that we see, where there are clicks here and there and changes of context and all of that. Instead, you are just asking an interface to do it, and then it happens, and you don't have to lose your focus, because you are still on the work in progress. I think productivity will skyrocket with that, for sure. [00:28:00] Speaker B: I agree. And I think the other thing too is that we've largely been talking about read only tasks, things that are accomplished in the LLM, and I think soon people will also have the LLM perform tasks for them within the vendors themselves too.
So: hey, in Vantage I need to go and create these five or six reports for management to review, versus just pulling cost data out into other systems. It's early days on that, and we only have a small set of things that you can do from a write perspective through MCP right now, but it'll be something that we expand on more, and I'm sure there'll be a bunch of use cases there as well. [00:28:41] Speaker A: It would be good to see. It's always tricky to get them to work perfectly when you're doing write operations, because there are a lot of things in place, but I'm sure you'll get through it. And talking a little bit about the future: how do you envision the FinOps ecosystem evolving, and how do you see MCPs affecting FinOps in the next few years? Do you think there will be impact from other areas apart from cloud, as we are seeing FinOps transition into a more hybrid model? What are your thoughts for the future on this topic? [00:29:22] Speaker B: Yeah, that's a good open ended question. I like it. I'll give a few different thoughts. Just as background on Vantage, we support over 20 different providers today. And that belief was there before the AI wave: that increasingly, organizations are going to use more and more providers. Five to ten years ago you would only use AWS, or you would only use Azure, or you would only use GCP. And we're already seeing this maturation of really big third parties like Datadog or Snowflake or Databricks that are now kind of a staple of an overall cloud footprint.
And I think AI is actually accelerating the original vision for Vantage, which is that organizations are increasingly going to use way more third party providers, and each one of those providers has its own bespoke usage based pricing component. That actually makes FinOps more relevant, and harder, just because of this proliferation of usage or consumption based pricing across all of them. The tilt that we have now, and it's happening very quickly, is that there were so many things we used to think about in terms of virtual machines, or Datadog hosts, or things along those lines, and now the nature of the conversation is that it's all around model usage or token usage coming from a whole suite of other providers. So in terms of where the market is going, I think we're going to see a lot more companies using way more providers, and a larger portion of them are going to be AI. I think a larger portion of the things that impact cost of goods sold, or overall margins at companies, are going to be AI related instead of primary cloud related. We're already seeing, in certain cases, customers' COGS being comprised of 5, 10, 15% OpenAI costs now, which just didn't exist not too long ago. And then on the practitioner side, in terms of how you can use AI for your FinOps practice: we were talking largely about what's driving cost, but I think people are going to increasingly use LLMs and MCP, whether it's MCP or some other protocol, as the default interface for interacting with FinOps environments and cloud cost data, using that as the way to get jobs done first. Whereas before, people were doing point and click or API calls; that will feel really manual in a couple of years. I'm obviously biased again, but we wouldn't do this if we didn't believe it.
I think you should have an expectation that your FinOps vendor has an AI story. If they don't, I think they will be left behind pretty quickly. And the more these vendors map to where the market is going, and my belief is that ChatGPT and Anthropic have kind of won, and that they're integrating MCP, you want your FinOps practice, whether it's in house or through a vendor, to support that along the way. So those are some general beliefs, and then, Victor, I'm sure we'll touch base again in three months and the whole space will have changed, because it's moving so quickly. But that's at least as far as the eye can see, and what I'm willing to go on record with saying, so we'll see in a couple of years if I was right. [00:32:54] Speaker A: Yeah, for sure. I think it's a difficult environment to predict. I think the whole AI agent topic is going to be huge in the next few years too: how we handle it as a society, as a community, and how it will revolutionize the way we work. I think that will be key as well. Being able to let an AI constantly do the work and consume, and how you control that consumption and how you measure it, is going to be a tricky topic too, apart from MCP and apart from AI itself. So yeah, those definitely are good insights and good topics. Damian, what do you think the future will provide for us related to AI and FinOps? [00:33:45] Speaker C: Sometimes when I think about it, I get scared, because you have AI as an innovation, and then people are using AI to accelerate innovation as well, because you have these great machines helping you, and it's accelerating the innovation, accelerating everything. So I don't know, I don't want to think about what will happen in the future.
I'm scared about my job sometimes when I think about it. But I believe that at first, like we were saying before, it will ease our work, and it will definitely allow us to focus more on the things that are harder for the AI. Like I was saying, I think we talked about it once: until we have a robot that is running among us and talking to us and understanding us, we are safe. So up to that point, I think we should embrace AI and let it help us in the day to day job. [00:35:01] Speaker A: Indeed, indeed. And I think the moment that you run a test with MCPs and try to make it work and see how everything interacts, it feels, like Ben said, magical. So again, Ben, it's been a pleasure to have you today. I hope people have learned a lot about AI, MCP and FinOps, and how it will evolve. As always, it's a pleasure to have you with us. [00:35:28] Speaker B: Yeah, likewise. Thank you for the time and for the conversation, and looking forward to the next one. [00:35:34] Speaker A: Yeah, for sure. We'll talk soon, and I'm looking forward to talking again in the future. Damian, as always, a pleasure to have you as well. [00:35:47] Speaker C: Thank you very much, it was great being here with you. Thank you, Victor, for this one. [00:35:53] Speaker A: Yeah, a pleasure to have you both today. And see you on the next episode. Bye bye.