
Continua Sponsors PAT & Real Time Quality Summit

Join us December 11-12 in Philadelphia!

Continua is a proud sponsor of the 3rd Annual PAT conference: Successfully Implement PAT Across the Drug Development Lifecycle to Increase Automation and Drive Cost Efficiency. This conference is designed exclusively for the pharmaceutical and biopharmaceutical community. Join 100+ leading PAT scientists, engineers, process development, CMC, MSAT and quality experts, and take part in interactive workshops addressing crucial challenges across regulatory requirements and automation implementation.

The maturity of PAT tools and the development of sophisticated software and chemometrics now provide the perfect opportunity to revolutionize traditional quality and analytical approaches, achieving the digital manufacturing maturity necessary for the future. With drug developers and manufacturers facing closer scrutiny on product quality than ever before, it's time to make the most of evolving process analytical technology (PAT) to design robust, cost-effective control strategies and deepen your understanding of your biopharmaceutical products.

Connect ground-breaking technologies with real-world manufacturing investment decisions to:

  • Explore cutting-edge data-driven case studies
  • Engage in dynamic roundtables
  • Deep dive into KOL-led panel debates

The Next Step in Your Digital Strategy

How to Assess and Build Your DX Strategy With
a Digital Transformation Maturity Assessment

One method to facilitate the creation of a Digital Strategy and a roadmap to begin the journey is to start with a Digital Transformation Maturity Assessment (DTMA). A DTMA is a comprehensive evaluation process used to determine an organization's current level of digital maturity. This assessment helps organizations understand where they stand with digital transformation and identifies areas for improvement. It produces a client-specific roadmap for starting or continuing the digital transformation journey—including current state, future state, basic architecture, risks and first steps.

Our deliverable is a final report that includes the evaluation, assessment, road map and maturity score. Key outputs are a digital strategy, a recommended basic architecture & technology, and a recommended proof-of-concept project to demonstrate the technology. Below is a graph depicting excerpts from an actual DTMA performed for one of our clients.

These suggested actions are representative of some of the potential benefits of taking the first steps toward adopting a corporate-wide digital strategy and ensuring your plants are on board. Please reach out to me to schedule your DTMA.

The Ideal Architecture for Digital Transformation

Know the ‘Current State’ of Your Business AND
‘Future State’ of Your Business in Real Time.

As covered in my last article, the first steps toward a digital transformation are to develop a strategy, get buy-in, and create a roadmap aligned with the business. Now, you’re ready to select a proof-of-concept pilot project. Typically, the pilot should be small enough to be completed over a 3-4 month period and use an Agile project management approach. An Agile approach promotes flexibility, collaboration, and continuous improvement throughout the project lifecycle—which is particularly important for the introduction of a new technology. In addition, smaller projects are more easily funded and quick wins can help justify bigger capital investments as the scope grows.

The process of digital transformation and the implementation of advanced technologies, such as Machine Learning (ML) and Artificial Intelligence (AI), require vast amounts of data. This data is not useful unless it can be interpreted in a meaningful way. Context adds meaning to raw data and transforms it into actionable information. We’ve all heard about the failures of Big Data projects due to lack of data contextualization.
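To make the idea of contextualization concrete, here is a minimal Python sketch showing how a bare process value becomes actionable information once it carries its asset location, tag, units, and timestamp. The field names and asset path are illustrative, not a specific standard or product schema:

```python
import json
from datetime import datetime, timezone

def contextualize(raw_value, tag, asset_path, units):
    """Wrap a bare process value with the metadata that gives it meaning:
    where it came from, what it measures, its units, and when it was read."""
    return {
        "asset": asset_path,   # e.g. site/area/line/unit (illustrative path)
        "tag": tag,
        "value": raw_value,
        "units": units,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# A raw reading of 37.2 means little on its own; contextualized, any
# consumer (historian, dashboard, ML pipeline) can interpret it.
msg = contextualize(37.2, "TI-101", "phila/cellculture/line1/bioreactor2", "degC")
print(json.dumps(msg, indent=2))
```

A payload like this is exactly what downstream ML/AI pipelines need: the value plus the context to know what it represents.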

The ability to transmit and contextualize large amounts of data throughout the enterprise requires rethinking how OT networks are architected. Traditionally, the main focus has been to protect the data behind firewalls, the argument being that opening ports in the firewalls begins to compromise the integrity of existing cybersecurity systems. The most common OT network architecture is the vertically stacked Purdue model, where hardware and software solutions are siloed and vendor-locked. In this scenario, unlocking the data requires custom protocols and licenses—not exactly a viable approach. Therefore, a new architecture has emerged that is better suited for large amounts of data and enterprise-wide data buses: the Hub/Spoke model.


A Hub/Spoke architecture uses: a low-bandwidth protocol called MQTT, an efficient data transmission method called publish/subscribe (or pub/sub), and data brokers that scale to high traffic volumes and provide redundancy. This network structure can communicate seamlessly with cloud applications because cloud platforms are already standardized on MQTT. A framework called the Unified Namespace (UNS) is superimposed on top of the Hub/Spoke network to further extend its capability as an enterprise-wide data hub. The combination—a Hub/Spoke network and UNS—forms the ideal infrastructure for an enterprise-wide data hub to support Industry 4.0 and predictive analytics, like ML and AI. Hub/Spoke is the backbone; MQTT data brokers provide the ability to transport large amounts of data; and the UNS provides a contextualized framework for interoperability and seamless data access.
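To illustrate the pub/sub pattern at the heart of Hub/Spoke, here is a minimal in-memory sketch in Python. The topic names and subscribers are hypothetical; a real deployment would use an MQTT broker rather than this toy class, but the decoupling it shows is the same—publishers and subscribers never need to know about each other:

```python
from collections import defaultdict

class Broker:
    """Toy in-memory broker: subscribers register a callback per topic;
    publishers push payloads without knowing who is listening."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, payload):
        # Deliver the payload to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(topic, payload)

broker = Broker()
received = []

# A dashboard and a historian both subscribe to the same UNS-style topic
# (enterprise/site/line/unit/measurement -- an illustrative hierarchy).
broker.subscribe("acme/phila/line1/reactor2/temperature",
                 lambda t, p: received.append(("dashboard", p)))
broker.subscribe("acme/phila/line1/reactor2/temperature",
                 lambda t, p: received.append(("historian", p)))

# One publish reaches both consumers -- no point-to-point links required.
broker.publish("acme/phila/line1/reactor2/temperature",
               {"value": 37.2, "units": "degC"})
```

Adding a third consumer—say, an ML pipeline—is one more `subscribe` call; the publisher is untouched. That is why this model scales where point-to-point integrations do not.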


Ultimately, everything and everyone is plugged into the network. As a single source of truth, your UNS is a virtual representation of the business in real time, which forms the basis for predictive analytics and actionable decisions. This real-time data feeds ML/AI: ML predicts future outcomes based on past patterns and the current state. The layers of the business are integrated and operate on data and information from all layers—in real time. Stakeholders know both the 'current state' and the 'future state' of the business in real time.

Now that you’ve been exposed to Hub/Spoke and UNS, you probably have some questions about your selection of a proof-of-concept pilot. Please reach out to me to start a dialog.

The Business Case for Digital Transformation

Unlock New Revenue Streams. Innovate Product Offerings.
Build Resilience Against Market Disruptions.

There are numerous white papers and surveys indicating that most CEOs believe data-driven transformation is crucial to their companies' futures. Furthermore, the popularity of generative AI (such as ChatGPT and other LLM-based tools) suggests that this technology is within practical reach. At the very least, manufacturers know they need to start investing in generative AI technologies in order to stay competitive.

The end-goal is that embedded AI and Machine Learning models will fuel predictive analytics and proactive decision-making—optimizing operations and maximizing quality, production, and yield.

We see and hear that companies do not have enough visibility into plant operations. For strategic planning and flexibility, a company needs to know the current state of its operations. Digital transformation empowers companies to optimize operations, enhance decision-making, and improve customer experiences through data-driven insights and automation. It enables companies to stay competitive in rapidly evolving markets by reducing costs, increasing efficiency, and accelerating time-to-market for new products and services. By leveraging advanced technologies, like AI and Machine Learning, businesses can unlock new revenue streams, innovate their product offerings, and build resilience against market disruptions.


Your first actions should be to invest in a clear strategy and vision aligned with your business and to secure executive sponsorship. Companies need a digital strategy to provide a clear roadmap for leveraging technology to achieve their business goals, ensuring alignment between digital initiatives and overall company objectives.

A well-defined digital strategy helps businesses navigate the complexities of digital transformation, prioritize investments, and address potential challenges like data security, integration, and change management. It enables companies to stay competitive by adapting to market trends, enhancing customer engagement, and continuously improving operational efficiency in an increasingly digital world.


Now that you understand the essentials to getting started on your Digital Transformation journey, I’ll next explain the ideal architecture needed for digital transformation to occur. In the meantime, please reach out to me with your comments or to start a dialog.

Paul Brodbeck to Present at Continuous Manufacturing Forum

September 17-19 | Princeton, NJ

Join us at the Continuous Manufacturing Forum, where Continua’s Paul Brodbeck will be speaking on Advances in Technology.

The Continuous Manufacturing Forum serves as a catalyst for compelling conversations, bringing together key opinion leaders in biotech and pharma. It unites executive leadership, directors, senior scientists, consultants, and a closely-knit network of CDMOs, academic leaders, and equipment providers.

Paul Brodbeck
Chief Technologist
Continua

Augmenting an MES

This pharmaceutical customer has a large amount of data that needs to be contextualized and brought into their MES, but that's not what an MES does best. Our project entails bringing that data into a Unified Namespace (UNS) ecosystem and then back into the MES, once contextualized. If you want to use your data outside of the MES, for instance in an ERP or other business systems, then the number of links in and out can grow exponentially. Our eventual goal is to get away from proprietary software solutions that sit in the middle of your stack. When you move to a UNS, you can use any software and it's easy to upgrade or switch, as well as much more flexible. Contrary to what many people may think, security is better too. In the traditional model, each layer sits behind a firewall and the data is well-protected, but every port you open adds risk. By the nature of the way a UNS works, getting IoT data out is less risky because you're publishing outbound with no need to open inbound ports. In fact, Gartner is now recommending this architecture in lieu of the traditional Purdue model.

Integrating Lab Automation Systems

Every pharmaceutical company uses laboratory analyzers that are operated manually, like the Nova FLEX cell culture analyzer, and frequently they need to perform calculations based on the results. Continua worked with this company to integrate their Nova FLEX data with their plant control system to automate the required actions based on the analysis results and to publish this data into the UNS for visibility across the business area.

Monitoring High-Value R&D Materials

This biotech company had already started its Unified Namespace (UNS) journey with a proof-of-concept project in mind, our recommended approach. As one of the most advanced companies in their industry, they wanted to use the UNS technology platform as a way to move a manual process—that of monitoring and tracking its highly valuable R&D materials data throughout the earliest stages of drug development—to a cloud-based, contextualized, and accessible asset. UNS is so new that it is not yet well understood by many companies, making a large-scale rollout too risky at once. Both the customer and Continua, while equally committed to UNS architecture, feel that these projects are best approached in an agile, iterative way, with engineering adjustments made to find the best path forward.

MES Data Informs Analytics

Manufacturing Execution Systems are a rich source of data for manufacturing analytics. They often provide information and context that are valuable for optimizing manufacturing and are not available in any other systems. However, extracting analytics data sets from these systems is challenging. Data generated by an MES is not structured and labeled for retrieval, but rather is organized in the context of manufacturing recipes which vary from site to site, product to product, batch to batch, and over time. The result is that many companies are not realizing the full value of their MES data to support analytics.

One way to attack this challenge is to hire a highly trained MES engineer to add code to export MES data and context in the format needed for analytics. While straightforward to implement, this approach is often time-consuming and expensive. It also requires constant upkeep and can result in MES and reporting performance issues. This approach can also add considerably more load on recipe authoring and testing, both initially and during lifecycle change management.

Another tactic is for the analytics tools to query the MES recipe directly. This approach can be faster to implement as it requires no recipe changes and can be tailored to the unique needs of the specific analytics platform. Unfortunately, because of the variability of recipe execution, changes to recipes over time, and limitations of the analytics software, these queries are often complex, brittle, and difficult to maintain. This solution can also result in duplicative efforts as custom queries must be crafted for each analytics tool. 

A third solution is to create a data presentation or abstraction layer, for example a Unified Namespace (UNS), as multiple systems want similar data and context from an MES. This third approach takes time and thought and may bring with it new systems and data flows—you will likely want to start with a pilot project. Done properly, the result can be a comprehensive and scalable solution that is robust and efficient to maintain. Depending on your situation, any of the above methods might be best.
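As a sketch of what such an abstraction layer does, the Python snippet below maps one flattened MES batch record onto a UNS-style topic path plus payload, so any analytics tool can consume it without knowing the recipe structure. The field names and topic hierarchy are purely illustrative, not a real MES schema:

```python
def mes_record_to_uns(record):
    """Map one flattened MES batch record to a (topic, payload) pair.
    The topic hierarchy (site/area/unit/batch/...) is an illustrative
    convention, not a standard -- each enterprise defines its own."""
    topic = "/".join([
        record["site"], record["area"], record["unit"],
        "batch", record["batch_id"], record["phase"], record["parameter"],
    ])
    payload = {"value": record["value"], "units": record["units"]}
    return topic, payload

# A hypothetical pH reading from a batch phase, as an MES might export it.
topic, payload = mes_record_to_uns({
    "site": "phila", "area": "cellculture", "unit": "bioreactor2",
    "batch_id": "B1234", "phase": "growth", "parameter": "pH",
    "value": 7.1, "units": "pH",
})
```

Once records land in a consistent namespace like this, every analytics tool queries the same structure instead of each one re-parsing recipe-specific exports.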

At Continua, our focus is to understand what data you need and how you want to use it. That understanding then informs the solutions we recommend, whether it be a targeted point solution that addresses an immediate need or a broader, holistic solution to provide a foundation for a wide range of analytics goals.