Being AI-native is not about an AI model or one team. It’s an ecosystem where every stakeholder has a critical role to play.
As the adoption of Artificial Intelligence (AI) evolves in Indian enterprises, CIOs face the question of how much of their AI should be built in-house and where vendor services can be employed.
According to a recent Massachusetts Institute of Technology (MIT) survey, external partnerships proved roughly twice as successful as internal builds. In the sample of 300 publicly disclosed AI initiatives, external partnerships with learning-capable, customized tools reached deployment ~67% of the time, compared to ~33% for internally built tools. While these figures reflect self-reported outcomes and may not account for all confounding variables, the magnitude of the difference was consistent across interviewees.
More strikingly, employee usage rates were nearly double for externally built tools. These partnerships often provided faster time-to-value, lower total cost, and better alignment with operational workflows. Companies avoided the overhead of building from scratch, while still achieving tailored solutions. Organizations that understand this pattern position themselves to cross the GenAI Divide more effectively.
Is hybrid the way out?
However, according to a June 2025 India-specific whitepaper from Boston Consulting Group, 'India's AI Leap - BCG Perspective On Emerging Challengers', "Leaders prioritize scalable systems -- choosing to build, buy, or partner based on required speed and differentiation. MLOps, AIOps, and experimentation frameworks are put in place to continuously deploy, monitor, and retrain models. The tech stack acts as the enabler for experimentation and value delivery."
Madhur Deora, ED, President & Group CFO of Paytm, recently offered a hybrid approach at the Modern BFSI Summit organized by The Financial Express in Mumbai. "We use AI tools to build AI tools. For example, managing our vast datasets requires custom tools built with AI." He advised businesses to start with first principles, assess what already works, and only then consider building from scratch. "Some tools built for customer service have turned out useful in finance too. There are efficiencies we uncover only when we start using them," he added.
“Nobody’s building their own LLMs. We wrap them with our data and deploy them for specific use cases,” said Sayantan Ghosh, Chief Risk Officer, Balancehero India. According to him, for most companies, it’s about operationalising existing models rather than building them from scratch.
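To illustrate what such "wrapping" can look like in practice, here is a minimal Python sketch. It assumes a hosted model accessed through the OpenAI client and a hypothetical internal lookup helper (fetch_customer_context); neither reflects Balancehero's actual stack. The pattern is simply to pull company data into the prompt and deploy the result for one narrow use case, rather than training a model from scratch.

```python
# Minimal sketch of wrapping an existing hosted LLM with internal data
# for a single use case. Illustrative only: the model name and the
# fetch_customer_context helper are assumptions, not any company's real stack.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def fetch_customer_context(customer_id: str) -> str:
    # Hypothetical stand-in for a lookup against an internal data store.
    return "Repayment history: 12/12 EMIs on time; current exposure: INR 40,000."


def assess_customer(customer_id: str, question: str) -> str:
    # Inject company data into the prompt; no model is built or trained in-house.
    context = fetch_customer_context(customer_id)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any hosted model would do here
        messages=[
            {"role": "system",
             "content": "You are a credit-risk assistant. Answer only from the provided context."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(assess_customer("CUST-001", "Summarise this customer's repayment behaviour."))
```

The design choice this sketch illustrates is the one Ghosh describes: the differentiation lives in the data and the use case wrapped around the model, not in the model itself.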
The ecosystem approach
A multi-stakeholder angle to building AI solutions can also be considered. "Our approach to AI is very clear: we build it from the ground up, never force-fitting AI into a business solution. We call this our AI-native ecosystem, which consists of three pillars: business users who are ready to consume AI, our ability to manufacture high-quality models, and the technology that makes these models usable in the critical path of decision-making," says Markandey Upadhyay, CDAO, Piramal Finance.
“Because business users are involved from the start, every solution goes through multiple rounds of iteration—training supervised models, fine-tuning the model as per user feedback, refining the interface, and making adoption natural. That’s how we move from 10% usage to near-universal adoption,” he says.
“The key lesson for us is simple: being AI-native is not about an AI model or one team. It’s an ecosystem where every stakeholder—from data scientists and AI Manufacturers to technology and business users—has a critical role to play,” adds Markandey.