Compliance Is No Longer the Goal, It Is the Byproduct
In Conversation with Mr Sekhar Surabhi
As pharmaceutical manufacturing transitions from static, batch-driven paradigms to dynamic, data-centric ecosystems, the definition of control, quality, and compliance is undergoing a fundamental shift.
In this conversation, Sekhar Surabhi, Founder & CEO of Caliber Technologies, offers a practical and systems-level perspective on how Pharma 4.0 is reshaping operational thinking, moving beyond parameter-based oversight to behaviour-driven control, real-time decision-making, and embedded compliance. Drawing from deep industry experience, he explores why true transformation lies not in layering digital tools onto legacy processes, but in reengineering the interplay between data, systems, and quality frameworks to enable consistency, traceability, and intelligent autonomy at scale.
In Pharma 4.0 environments, where systems are designed to continuously generate, contextualize, and act on process data, how should organizations redefine the concept of a “state of control” beyond traditional parameter monitoring, particularly in terms of system behaviour, feedback loops, and autonomous adjustments?
In a Pharma 4.0 environment, I don’t think we can continue to define a “state of control” purely in terms of parameters being within limits at a given point in time. That definition is too static for systems that are inherently dynamic.
For me, control is about how a system behaves over time: how it responds to variability, whether its feedback loops are functioning as intended, and whether any adjustments it makes are predictable and explainable. The Pharma 4.0 guidelines define a clear model of digital maturity, a five-stage progression whose final stage, “Optimized”, is a continuous process. A process may appear to be in control from a parameter standpoint, but if its behaviour is inconsistent under slightly different conditions, that is a deeper concern and needs optimization.
So the shift I see is from monitoring values to understanding behaviour. A true state of control is one where the system demonstrates stability, traceability, and consistency in how it operates over a period of time, not just where it sits at a particular moment.
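To illustrate the difference between monitoring values and monitoring behaviour, here is a minimal Python sketch. It applies two classic run rules (in the spirit of the Western Electric/Nelson rules); the limits, thresholds, and readings are illustrative assumptions, not drawn from the interview or any specific guideline.

```python
from statistics import mean

def longest_one_sided_run(values, centre):
    """Length of the longest run of consecutive points on one side of the centre line."""
    longest = run = 0
    side = None
    for v in values:
        s = v > centre
        run = run + 1 if s == side else 1
        side = s
        longest = max(longest, run)
    return longest

def behaviour_check(values, lcl, ucl):
    """Flag behavioural concerns even when every point sits within limits."""
    flags = []
    centre = mean(values)
    if any(not (lcl <= v <= ucl) for v in values):
        flags.append("point outside control limits")
    if longest_one_sided_run(values, centre) >= 8:
        flags.append("sustained shift: 8+ consecutive points on one side of centre")
    if any(all(values[i + j] < values[i + j + 1] for j in range(5))
           for i in range(len(values) - 5)):
        flags.append("trend: 6 consecutive increasing points")
    return flags

# Every reading sits comfortably inside the 90-110 limits, yet the series
# shows a steady upward trend; the behaviour, not any single value, is the concern.
readings = [98, 99, 100, 101, 102, 103, 104, 105, 106]
print(behaviour_check(readings, lcl=90.0, ucl=110.0))
# -> ['trend: 6 consecutive increasing points']
```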
As manufacturing operations transition from batch-wise review to continuous data streams, how does this shift impact decision latency, escalation pathways, and deviation management, especially in high-throughput or continuous manufacturing settings?
In my experience, the most immediate impact is on decision latency: what was once reviewed retrospectively now needs to be interpreted and acted upon in real time.
This points directly to the control strategy: every batch produced is compared in real time, leveraging statistical tools to see where process capability is heading, whether it is improving or declining. Decision-making becomes real time. In fact, there is a dedicated group under Pharma 4.0 working to establish a guideline on this.
This changes escalation pathways quite significantly. Traditional hierarchical escalation models tend to slow things down, whereas continuous environments require more immediate and event-driven responses. The system has to be able to surface what matters, and the organization has to be structured to respond without delay.
Deviation management also evolves in this context. Instead of investigating deviations after a batch is complete, the focus shifts to identifying and correcting them as they emerge.
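As one way to picture this kind of real-time control strategy, the sketch below trends process capability (Cpk) over a rolling window as each batch result arrives, and surfaces a signal when capability declines. The window size, specification limits, and alert threshold are hypothetical assumptions for illustration.

```python
from collections import deque
from statistics import mean, stdev

class CapabilityTrend:
    """Rolling Cpk estimate, updated as each batch result arrives."""
    def __init__(self, lsl, usl, window=20):
        self.lsl, self.usl = lsl, usl
        self.values = deque(maxlen=window)
        self.history = []                  # Cpk after each batch, for trending

    def add_batch(self, value):
        self.values.append(value)
        if len(self.values) < 3:
            return None                    # too little data for a stable estimate
        mu, sigma = mean(self.values), stdev(self.values)
        if sigma == 0:
            return None
        cpk = min(self.usl - mu, mu - self.lsl) / (3 * sigma)
        self.history.append(cpk)
        return cpk

    def is_declining(self, span=5, threshold=1.33):
        """Signal when recent capability is low or consistently trending down."""
        recent = self.history[-span:]
        if len(recent) < span:
            return False
        return recent[-1] < threshold or all(
            a > b for a, b in zip(recent, recent[1:]))

trend = CapabilityTrend(lsl=95.0, usl=105.0)
for assay in [100.1, 99.8, 100.3, 101.0, 101.6, 102.1, 102.7]:
    cpk = trend.add_batch(assay)
    if cpk is not None and trend.is_declining():
        print(f"escalate in real time: Cpk = {cpk:.2f} and declining")
```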
Many organizations implement digital layers without fundamentally re-engineering underlying processes. From your experience, what are the most common failure modes where Pharma 4.0 initiatives fall short of delivering “compliance as a byproduct,” and what systemic gaps typically drive these outcomes?
One of the most common patterns I have observed is that organizations try to digitalize existing processes without fundamentally rethinking them. In many cases, only a single department or function is taken up for digitalization. In such cases, digital systems end up as silos, replicating manual inefficiencies rather than resolving them.
What happens then is that you get better visibility for a function, but not necessarily better holistic control. Data exists, but it lacks context or correlation. Systems are automated, but the underlying decision logic isn’t clearly defined. As a result, compliance still requires effort, because it doesn’t emerge naturally from the system.
In my view, Quality Maturity should be seen as the foundation of the digital maturity pathway; that is what ensures compliance becomes a definitive outcome. The core issue is often a lack of alignment between process design and data architecture. Unless those are addressed together, Pharma 4.0 initiatives tend to fall short of delivering compliance as a byproduct.
When compliance is embedded within system design rather than enforced through retrospective review, how should traditional GMP constructs such as batch release, review-by-exception, and quality oversight evolve, particularly in the context of real-time or parametric release frameworks?
I can say batch release becomes less of a checkpoint and more of a continuous assurance that the process has remained within a defined state of control throughout.
Review-by-exception becomes meaningful only if the system can reliably identify what truly constitutes an exception. Digitalization should put the right checks and balances in place, shifting from ad hoc human decisions to policy- and rule-based decisions. That is the foundation of review-by-exception; otherwise, you either miss critical signals or generate too much noise.
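Here is a minimal sketch of what rule-based exception screening might look like; the rule names, thresholds, and event fields are hypothetical. The point is simply that only events tripping a defined rule reach a human reviewer.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    applies: Callable[[dict], bool]    # True when the event is a genuine exception

# Hypothetical policy set; in practice these come from approved procedures.
RULES = [
    Rule("out-of-spec result",
         lambda e: not (e["lsl"] <= e["result"] <= e["usl"])),
    Rule("late data entry",
         lambda e: e.get("entry_delay_minutes", 0) > 30),
    Rule("manual override used",
         lambda e: e.get("manual_override", False)),
]

def screen_event(event: dict) -> list[str]:
    """Names of triggered rules; an empty list means no human review is needed."""
    return [rule.name for rule in RULES if rule.applies(event)]

event = {"result": 99.2, "lsl": 95.0, "usl": 105.0, "manual_override": True}
print(screen_event(event))    # -> ['manual override used']
```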
Quality oversight, in this context, shifts from reviewing outputs to governing the system itself, ensuring that rules are correctly defined, that systems behave as expected, and that any deviations are both meaningful and traceable. Overall, I can say the emphasis moves from volume of review to confidence in the system.
As decision-making increasingly shifts toward algorithm-assisted or system-driven actions, how should validation frameworks expand to address not only process performance, but also the reliability, explainability, and governance of embedded decision logic within digital systems?
I believe that as systems become more autonomous or algorithm-driven, it is no longer sufficient to validate outputs alone.
We need to understand whether the underlying logic is reliable under different conditions, whether the decision pathways are transparent, and how changes to that logic are governed over time. In other words, rather than validating a single set of results produced by an algorithm, we need to look at groups of decisions made over a period and assess the validity of the algorithm against larger data sets.
This becomes particularly important in environments where systems are adaptive or continuously updated. Validation, in that sense, becomes less of a one-time activity and more of an ongoing discipline.
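One way to treat validation as an ongoing discipline is periodic re-qualification of decision logic against accumulated cases. The sketch below assumes a callable decide(record) and a history of reviewer-confirmed outcomes; the function name, record shape, and agreement threshold are all illustrative assumptions.

```python
def requalify(decide, history, min_agreement=0.98):
    """Replay past cases through the current logic and measure agreement.

    `history` is an iterable of (record, confirmed_outcome) pairs gathered
    over a review period, i.e. a group of decisions rather than a single run.
    """
    total = agreements = 0
    disagreements = []
    for record, confirmed in history:
        total += 1
        produced = decide(record)
        if produced == confirmed:
            agreements += 1
        else:
            disagreements.append((record, produced, confirmed))
    rate = agreements / total if total else 0.0
    # Below the agreed threshold, the logic (or its input data) needs
    # investigation before it is trusted for another period.
    return rate >= min_agreement, rate, disagreements

# Hypothetical release rule and a small confirmed-outcome history.
decide = lambda r: "release" if r["cpk"] >= 1.33 else "hold"
history = [({"cpk": 1.50}, "release"),
           ({"cpk": 1.20}, "hold"),
           ({"cpk": 1.41}, "release")]
ok, rate, issues = requalify(decide, history)
print(ok, f"{rate:.0%}", issues)    # -> True 100% []
```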
From a regulatory and inspection standpoint, how are agencies interpreting highly automated, data-centric manufacturing environments where human intervention is limited, and what forms of evidence most effectively demonstrate control, traceability, and data integrity in such setups?
From what I have seen, regulators are not opposed to highly automated or data-centric environments. What they are focused on is whether control, traceability, and data integrity can be clearly demonstrated and attributed to an event.
In automated systems, the expectation shifts slightly. Instead of relying on human intervention as evidence of control, organizations need to show that the system itself is well understood. That includes clarity of logic, completeness of audit trails, and the ability to reconstruct what happened and why.
In my experience, what resonates most during inspections is not complexity, but clarity. If an organization can clearly explain how its systems function and how decisions are made, it builds confidence regardless of how advanced the technology is.
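As a toy illustration of “completeness of audit trails and the ability to reconstruct what happened and why,” here is a hash-chained, append-only audit log in Python. The who/what/when/why fields and the chaining scheme are assumptions for illustration, not a description of any particular system.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only log where each entry is chained to the previous one."""
    def __init__(self):
        self.entries = []

    def record(self, who, what, why, details=None):
        entry = {
            "who": who,
            "what": what,
            "why": why,
            "when": datetime.now(timezone.utc).isoformat(),
            "details": details or {},
            "prev_hash": self.entries[-1]["hash"] if self.entries else None,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute the chain; editing any past entry breaks verification."""
        prev = None
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            body = {k: v for k, v in entry.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True

trail = AuditTrail()
trail.record("system", "setpoint adjusted", "feedback loop correction",
             {"from": 40.0, "to": 40.5})
trail.record("j.doe", "deviation opened", "temperature excursion")
print(trail.verify())    # -> True
```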
Looking ahead, as AI-driven systems and digital twins begin to influence or autonomously execute process decisions, what new risks emerge around decision opacity, model drift, and unintended process variability, and how should organizations design controls and oversight mechanisms to ensure sustained compliance and trust?
In regulated environments where consistency and accountability are critical, if models evolve in ways that are not fully visible, it becomes difficult to maintain confidence in the system.
The way forward, in my view, is not to limit the use of these technologies, but to design appropriate controls around them with a human in the loop. That includes continuous monitoring of model performance, clear boundaries that limit autonomous action to non-critical operations, and mechanisms for human oversight of all compliance-critical actions.
Ultimately, the question is not whether systems can act autonomously, but whether that autonomy operates within a well-defined and controlled framework. For now, that means keeping a human in the loop for all GMP-critical operations.
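A compact sketch of such an autonomy boundary: non-critical actions execute autonomously, while anything classified as GMP-critical is queued for human approval. The classification set, action names, and queue are hypothetical stand-ins for what a real MES/QMS would provide.

```python
# Hypothetical classification; a real system would derive this from an
# approved risk assessment.
GMP_CRITICAL = {"batch_release", "specification_change", "deviation_closure"}

def execute(action, payload, apply_fn, approval_queue):
    """Non-critical actions run autonomously; GMP-critical ones await a human."""
    if action in GMP_CRITICAL:
        approval_queue.append((action, payload))    # human must review and approve
        return "pending human approval"
    apply_fn(action, payload)                       # inside the autonomy boundary
    return "executed autonomously"

queue = []
print(execute("setpoint_trim", {"delta": 0.2}, lambda a, p: None, queue))
print(execute("batch_release", {"batch": "B-102"}, lambda a, p: None, queue))
print(queue)    # -> [('batch_release', {'batch': 'B-102'})]
```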
Disclaimer: The views and opinions expressed in this editorial are those of the interviewee and are based on their professional experience in pharmaceutical engineering and sterile manufacturing. They do not necessarily reflect the official views, policies, or positions of Hello Pharma, its management, or its affiliates. Hello Pharma does not endorse or take responsibility for any specific technical, commercial, or regulatory interpretations presented in this article. Readers are encouraged to independently evaluate the information shared, review applicable regulatory guidance, and rely on their own experience, expertise, and professional judgment before making decisions related to equipment selection, system design, validation strategy, or regulatory compliance.
