Deepak Narayanan

Transformation does not follow a straight path. Anyone who tells you otherwise is either selling you something or hasn’t done enough of it. Strategies shift, assumptions crack, and even the best-laid programs hit moments that humble everyone in the room. But here’s what nearly two decades of implementation-focused consulting has taught us at Practus: these moments — the stumbles, the pivots, the uncomfortable reckonings — are where the real learning lives. 

Across the transformation engagements we have managed at Practus — over fifteen hundred companies, from funded startups to Fortune 500s, across nine countries — a clear pattern has emerged: our most valuable insights never came from the engagements that went perfectly. They came from the ones that forced us to stop, rethink, and rebuild our approach. 

We caught up with Deepak Narayanan, co-founder and CEO of Practus, to ask him a few questions about what he and the team have learned over years of implementing transformations.

  1. Looking back across multiple engagements, what are some of the most important lessons Practus has learned from projects that didn’t go exactly as planned?

    DN: A consulting firm’s transformation efforts start with a clearly stated objective for the client. The client organization may want to boost working capital, reduce costs, or increase revenue. But here’s what I’ve learned after working with over one thousand five hundred companies: the problem the client articulates in the first meeting is almost never the real problem. It’s usually a symptom.  

    Take the case of a mid-sized pharma company – around ₹800 crore in revenue – that brought us in saying, “We need to fix our working capital.” Within the initial scope, our team focused on receivables and cash cycles. But three weeks in, the team on the ground realized the real issue was a deeply fragmented procurement process: three different plants were buying the same raw materials from different vendors at wildly different prices, with no centralized visibility.

    If we had just optimized the receivables cycle as initially scoped, we’d have delivered a marginal improvement and completely missed the structural issue. That engagement taught us to always build a “discovery sprint” into the first few weeks — not to validate the brief, but to challenge it.

    The second big lesson is about people, not processes. Early in our journey, we were very process-and-framework-heavy. With a family-owned logistics company – a second-generation business with about ₹500 crore in revenue – we designed a brilliant route-optimization model that could have saved them 12–15% on fuel costs. But the fleet managers, who’d been with the promoter family for 20 years, saw it as a threat. They slow-walked the implementation. That taught us a painful but essential lesson: transformation without trust is just a PowerPoint.

    The Lesson: These experiences reshape how transformation should begin. Instead of moving straight to solutions based on the problem as the client states it, the early phase must be immersed in operations, data, and decision-making. The approach trades a slower start for greater clarity.
  2. Where do things most commonly go off track: problem definition, stakeholder alignment, pace of execution, or something else?

    DN: Organizations assume that once the problem is understood, the hardest work is done. Honestly? It’s usually a cocktail, but if I had to rank them, stakeholder alignment is the silent killer. You can course-correct on problem definition if you have the right stakeholder relationships. What’s much harder to fix is a CFO who’s bought in while the COO sees the engagement as an intrusion.

    Take the logistics company I mentioned earlier. When our route-optimization model projected fuel savings of 12–15%, the analysis was sound, but adoption stalled because the fleet managers – tenured people who’d been with the promoter family for two decades – saw it as a threat to their judgement and autonomy. They slow-walked it, and the savings never materialized in year one. The solution existed, but trust did not. Now we invest heavily in what we call “ground-level alignment” — spending real time with mid-management and making them co-architects of the change rather than recipients of it.

    We saw this play out at a mid-market manufacturing firm in western India. The founder wanted us to professionalize the entire operations backbone, but his son, who managed the factory floor, felt the existing system “worked fine.” The engagement became a tug-of-war, and we ended up spending as much energy on internal alignment as on the actual transformation work.

    The other thing that derails projects more than people admit is data readiness. Clients often overestimate the quality of their own data. A large automotive components manufacturer brought us in to improve profitability, but when we pulled product-level margin data, we found that 40% of their cost allocation was based on legacy assumptions that hadn’t been updated in years. We had to rebuild the costing model before we could even begin, which added eight weeks to the timeline.

    The Lesson: Stakeholder alignment is often the deciding factor in whether change moves forward or stalls. If transformation is simply delivered to an enterprise rather than built into it, resistance follows. Effective programs engage middle management early, address concerns directly, and involve key stakeholders in shaping the change.
  3. Have there been moments where the team realized mid-engagement that the original approach needed to change? What usually triggers that realization?

    DN: Absolutely, this happens more often than any consulting firm would like to admit. The difference is whether you have the humility and the systems to catch it early.

    In a case that stands out, we were working with a healthcare services company expanding rapidly across tier-2 cities. The brief was revenue optimization. Our initial approach was classic: pricing analysis, service mix optimization, and marketing funnel improvements. About six weeks in, our team on the ground started noticing something the leadership hadn’t flagged — attrition among front-desk and nursing staff was running at 45% annually in newer centers. No amount of pricing optimization was going to fix a center where patients were greeted by a different person every month.

    We pivoted midstream, brought in an HR workstream, and redesigned incentive structures. Revenue improvement followed, but through a completely different path than what was originally scoped.

    We faced similar friction in an ERP migration for a pharma company. Midway through the project, it was evident that legacy data quality was much lower than anticipated. Instead of pushing through and hoping for the best, our team raised a red flag, proposed a phased go-live instead of a big-bang cutover, and worked with the client’s IT team to build a parallel validation layer. It added cost and time, but saved them from what could have been a catastrophic go-live failure.

    What triggers these pivots? Usually, it’s our on-ground consultants. This is why our model of embedding teams inside the client’s operations is so critical. Sitting in a boardroom, you’d never catch these signals. We’ve also built what we call “30-60-90 reality checks” into every engagement — at each milestone, we explicitly ask: is the original hypothesis still valid?

    The Lesson: Data readiness cannot be taken for granted; it is a phase of transformation in its own right. Leaders who recognize this early can plan realistic timelines and avoid delays later.
  4. How do you distinguish between a strategic misjudgment – framing the wrong problem – and an execution challenge during implementation?

    DN: This is a really important question – getting the diagnosis wrong can be expensive.

    A strategic misjudgment occurs when the compass points in the wrong direction. An execution challenge is when you’re pointed right, but the terrain is harder than expected.

    Here’s a concrete example. We worked with a consumer goods company that was losing market share. The initial framing was a distribution problem, so we went deep on route-to-market, distributor margins, and coverage gaps. Execution was solid and the team was working hard, but the numbers weren’t moving. About four months in, we looked at the data differently. The issue wasn’t distribution: their product was on shelves. A competitor had launched a value-for-money variant that was cannibalizing them at the point of sale. That was a strategic misjudgment in framing.

    Compare that with a family-owned textile company where we were restructuring the supply chain. The strategy was right, but execution stalled because the purchase head had informal arrangements with certain vendors going back decades — a very different intervention was required.

    The tell-tale sign of a strategic misjudgment is effort that stays high while leading indicators don’t move. If your team is executing well and the KPIs are still flat, the problem is usually upstream – you are solving the wrong equation. Execution challenges show up as friction: delays, resistance, and timeline slippage. But the directional metrics start showing life once you break through the resistance.

    The Lesson: High effort with flat indicators points to a framing problem; friction with indicators that respond once resistance clears points to an execution problem. Diagnosing which one you face determines the right intervention.
  5. When something doesn’t go as expected in a project, how do you balance transparency with the client while still keeping the engagement moving forward? 

    DN: This is where our entire model gives us an advantage, and I say this with conviction because we built Practus specifically to solve the “skin in the game” problem in consulting. When your fee structure is tied to ROI delivery – and our clients know we commit to a 3x to 12x return on fees – transparency isn’t optional. It is structural.

    Even when backed by a strong diagnosis and alignment, things will go sideways. If something isn’t working, the client will see it in the numbers. There’s no hiding behind activity reports or beautifully formatted slide decks. So we have learned to get ahead of it.

    Our practice is what we call “early escalation”: we flag a workstream the moment it is at risk, not after it has failed. And we come to the client not just with a problem statement but with two or three alternative paths.

    The healthcare engagement I described earlier is a good example of this in practice. When our on-ground teams flagged the attrition problem, we didn’t wait for the revenue numbers to miss. We brought the finding to the client along with alternatives, shifted the engagement to include retention incentives and workforce dashboards, and revenue improved – through a different path than originally planned.

    Practus managed a similar reset at the consumer goods company that had lost market share. Initial work centered on distribution, but shelf presence was already strong; it was a competitor’s value-priced product that had changed buying behavior at the point of sale, requiring a rethink of the strategy.

    Early escalation plays a critical role in course correction. When a workstream begins to drift, surfacing the risk early lets teams adjust before momentum is lost, and arriving with alternative paths alongside the problem keeps the engagement moving. The approach strengthens trust because obstacles are addressed before they become crises.

    The Lesson: Reassess assumptions regularly. Consistent checkpoints let the team confirm whether early hypotheses still hold, and walking the shop floor surfaces the disconnect between what leaders believe and what is actually happening. When the evidence changes, the approach must change with it.
  6. What internal practices has Practus developed over time to ensure that lessons from one engagement meaningfully improve the next one? 

    DN: Lessons from one transformation matter only when they influence the next. This has been a journey for us – our early approach was pretty ad hoc. Partners and teams would carry lessons in their heads, and knowledge transfer happened over chai conversations. That doesn’t scale.

    Today, we have several structured mechanisms. The first is what we call “Post-Engagement Reviews” (PERs). Every significant engagement goes through a structured debrief – not a superficial “what went well/what didn’t” exercise, but a deep dive where we ask: what did we assume that turned out to be wrong? Where did we create value we didn’t expect? What would we do differently with the same client today? These reviews are facilitated by someone who wasn’t on the engagement, to keep the lens objective.

    Second, we have invested heavily in our Gurukool program, our internal learning engine. Gurukool isn’t just about teaching Power BI or Excel skills. It’s where real engagement stories get codified into case studies. When a team discovers that a particular approach to stakeholder management worked brilliantly with a promoter-led business, that becomes a Gurukool module. When a finance transformation hits a wall because of change management gaps, that becomes a teaching case. We are essentially turning our scar tissue into institutional knowledge.

    https://youtu.be/olSEjTr9hl0?si=lGMkfK_pU-a9efQs

    Third, and this is something I am personally proud of, we have built sector-specific playbooks that evolve continuously – every engagement adds a layer. Our healthcare playbook today is vastly different from what it was three years ago. The pharma chapter now includes a specific section on “data readiness red flags,” which came directly from the ERP engagement I mentioned earlier. Teams start new engagements with stronger context and better questions.

    Finally, there’s the culture piece. At Practus, we’ve tried hard to build an environment where acknowledging a mistake isn’t career-limiting; it’s career-building. We celebrate the team members who flag problems early, who say, “My hypothesis was wrong,” and who pull the team back to reassess. That is harder to institutionalize than any process, but it’s probably the most important thing we have done.


    The Lesson: Insights must be recorded, shared, and revisited. Organizations that turn their learnings into a system build stronger foundations for future change. This discipline steadily reduces repeat mistakes and improves the quality of decision-making.
  7. When you reflect on these experiences today, how have they shaped the way Practus approaches complex transformation mandates? 

    DN: Every one of these experiences has been a forge, and I don’t use that word lightly. Three fundamental shifts have happened.

    First, we’ve moved from “solution-first” to “diagnosis-first.” We call it “immersion before prescription.” It’s slower upfront, but it dramatically reduces mid-course corrections.

    Second, change management is now a first-class citizen, not an afterthought. For any engagement involving more than 50 people, we staff a dedicated change management workstream from day one. That logistics engagement I mentioned — where the fleet managers resisted — was the turning point.

    Third, we’ve become much more disciplined about sustainability design. Every solution now has to answer one question: will this survive six months after we leave? We build capability-transfer milestones and run “shadow periods” in which the client’s team operates the new processes while we watch but don’t do.

    The biggest meta-lesson is this: in consulting, your credibility isn’t built on the engagements where everything went perfectly. It’s built on the ones where things went sideways, and you had the integrity to own it, the creativity to fix it, and the humility to learn from it.

    The Lesson: Credibility is not built through engagements that run exactly as planned. It is built through the disciplined habit of reassessing, adapting, and continuing to work until impact takes hold. That’s what “Results. Delivered” is about.

Complex change is never clean. It tests you at every stage — in diagnosis, alignment, data readiness, execution pace, and the willingness to learn. At Practus, failure has become our forge. Not because we seek it out, but because when your business model depends on delivering measurable ROI, you can’t afford to look away from what didn’t work. Every scar has made our next engagement sharper, more honest, and more resilient. That’s what “Results. Delivered” really means — not that we get it right every time, but that we never stop until we do. 

By Deepak Narayanan