Optimizing Biomanufacturing Using Three Different Methods

A recap from the morning sessions during day 1 of BPI in Boston, MA.

Biomanufacturers are using increasingly sophisticated methods to optimize their processes, maximizing the efficiency of their production assets and reducing their cost of goods to gain a competitive advantage.

  1. Design of Experiments

Martin Kane, a data scientist from Exponent, described how engineers can apply Design of Experiments (DoE) methods as a powerful set of tools for understanding and optimizing bioprocess systems. DoE allows companies to map the “design space” of their processes and describe it with simple mathematical models. In this way, said Kane (who was speaking at the Manufacturing Optimization and Process Intensification: Predictive Models and Product Lifecycle Management session at the BioProcess International Conference & Exhibition, 2016), manufacturers can reduce their development time and increase throughput.

Kane recommended that companies use a three-step approach: first, DoE screens are performed to identify factors that could influence the process; second, the design space is characterized; and finally, the models are verified using a small number of process runs. This verification stage allows companies to demonstrate that the defined operating ranges are robust, yet unfortunately engineers often skip it.
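The three steps Kane described can be sketched in code. The factors (temperature, pH, feed rate), the titer response, and all numbers below are hypothetical placeholders invented for illustration, not data from the talk:

```python
# A minimal sketch of the screen -> characterize -> verify DoE workflow.
# Factors and responses are synthetic, for illustration only.
import itertools
import numpy as np

# Step 1: screening -- a 2-level full factorial over candidate factors,
# coded as -1/+1. Hypothetical factors: temperature, pH, feed rate.
factors = ["temperature", "pH", "feed_rate"]
design = np.array(list(itertools.product([-1.0, 1.0], repeat=len(factors))))

# Synthetic titer response: temperature and pH matter, feed rate does not.
rng = np.random.default_rng(0)
titer = 10 + 2.0 * design[:, 0] - 1.5 * design[:, 1] + rng.normal(0, 0.1, len(design))

# Step 2: characterize the design space with a simple mathematical model
# (least-squares fit of an intercept plus main effects).
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, titer, rcond=None)
effects = dict(zip(["intercept"] + factors, coeffs))

# Step 3: verify -- predict a small number of confirmation runs inside the
# design space and check them against actual process runs.
check = np.array([[0.5, -0.5, 0.0]])
predicted = np.column_stack([np.ones(1), check]) @ coeffs
print({k: round(float(v), 2) for k, v in effects.items()})
```

In a real screening study a fractional factorial would usually replace the full factorial to keep the run count down; the structure of the three steps is the same.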

  2. Computational Fluid Dynamics

Suresh Nulu, a senior engineer from Biogen Inc., presented at the same conference session. He described a case study, “Faster, Cheaper, Better”, on the use of predictive models to optimize drug product mixing processes. He explained that ‘at-scale’ mixing studies with drug product are very expensive to perform. Furthermore, the costs associated with a failed mixing step are extremely high. Biogen wanted to develop a comprehensive data package that leveraged both experiments and Computational Fluid Dynamics (CFD) for different single-use mixers under a range of conditions. By having this data package the company hoped to avoid ‘at-scale’ studies, but the new approach required a culture shift as staff learned to trust the data generated by predictive modelling.

CFD modelling data proved its utility and helped to identify dead-zones that engineers may not have otherwise identified due to the location of probes. Biogen verified the models using experimental data to ensure that they were “fit for purpose” predictive models. The experimental and CFD data aligned very closely with one another.
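A “fit for purpose” verification of the kind described can be sketched as a simple comparison of predicted and measured mixing times. The values and the 10% tolerance below are hypothetical placeholders, not Biogen's data or acceptance criteria:

```python
# A minimal sketch of a "fit for purpose" model check: CFD-predicted
# mixing times vs bench measurements. All numbers are hypothetical.
cfd_predicted = [45.0, 62.0, 90.0]   # seconds, one value per condition
measured      = [47.0, 60.0, 93.0]   # seconds, from experiments

def fit_for_purpose(predicted, observed, tolerance=0.10):
    """Accept the model if every prediction falls within a relative
    tolerance (here 10%) of the corresponding measured value."""
    return all(
        abs(p - o) / o <= tolerance for p, o in zip(predicted, observed)
    )

print(fit_for_purpose(cfd_predicted, measured))
```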

The resulting mixing data packages define mixing times at different impeller speeds and at three different mixing volumes. The company identified upper and lower boundaries that avoid dead-zones at low impeller speeds and both vortexing and foaming at high impeller speeds. Control limits for mixing steps were subsequently defined within the design space.
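Deriving control limits from such a data package amounts to finding an impeller-speed window that stays clear of both failure modes at every volume. The speeds, volumes and margin below are hypothetical placeholders, not Biogen data:

```python
# A minimal sketch of deriving an operating window from a mixing data
# package. All numbers are hypothetical placeholders.

# For each mixing volume (L): the impeller speed (rpm) below which
# dead-zones appear and above which vortexing/foaming begins.
limits_by_volume = {
    100: {"dead_zone_below": 60, "vortex_above": 220},
    200: {"dead_zone_below": 80, "vortex_above": 240},
    500: {"dead_zone_below": 110, "vortex_above": 260},
}

def operating_window(margin=10):
    """Return a single (lower, upper) rpm window valid at all volumes,
    with a safety margin applied to both boundaries."""
    lower = max(v["dead_zone_below"] for v in limits_by_volume.values()) + margin
    upper = min(v["vortex_above"] for v in limits_by_volume.values()) - margin
    if lower >= upper:
        raise ValueError("no common operating window at this margin")
    return lower, upper

low, high = operating_window()
print(f"control limits: {low}-{high} rpm")  # -> control limits: 120-210 rpm
```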

By combining the use of experiments with CFD modelling, Biogen was able to proactively explore the design space and:

  • Eliminate ‘at-scale’ surrogate studies during process transfers.
  • Make data-driven equipment-sizing decisions during process transfers.
  • Eliminate ‘at-scale’ studies often used to understand the effects of equipment on Biogen’s drug product.
  • Set process controls and sampling strategies during process development.

  3. Continuous Process Verification

Concluding the session, Tom Mistretta, Principal Engineer at Amgen, described the company’s approach to maximizing the value of commercial-scale data. Mistretta explained that the drivers for this work were the needs of their increasingly complex product pipeline with multiple modalities, their expanding geographical footprint, and the availability of systems that can effectively aggregate data.

The company uses continuous process verification as part of their lifecycle management approach. Mistretta pointed out that a typical commercial batch will generate over 500 QC entries, 2000 batch record entries and 500 million continuous data point entries. Biotech companies can use this data to identify and control sources of bioprocess variation leading to further process performance optimization.
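One common way to turn aggregated batch data into control of variation is a statistical process control chart. The sketch below is an individuals (I) chart with 3-sigma limits estimated from the moving range; the yield figures are synthetic and not Amgen data:

```python
# A minimal sketch of continuous process verification on batch data:
# an individuals (I) control chart with 3-sigma limits. Yields are
# synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(1)
yields = rng.normal(loc=95.0, scale=1.0, size=30)  # % yield per batch
yields[25] = 89.0  # inject one out-of-control batch

center = yields.mean()
# Estimate sigma from the average moving range (d2 = 1.128 for n = 2).
moving_range = np.abs(np.diff(yields))
sigma_hat = moving_range.mean() / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# Batches outside the control limits warrant investigation.
out_of_control = [i for i, y in enumerate(yields) if y < lcl or y > ucl]
print(out_of_control)
```

In practice a CPV program would track many parameters per batch and apply further run rules (trends, shifts) beyond the simple limit check shown here.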

Mistretta gave an example of a case study in which Amgen analyzed data from commercial-scale lots to identify a process performance parameter that was under sub-optimal control. The company was able to improve the control of the parameter using process analytical technology, which resulted in higher process yields without having a detrimental impact on the product’s critical quality attributes. The insights gained from continuous process verification are being fed back into the process design activities of pipeline processes, allowing Amgen to increase their staffing and facility flexibility. Furthermore, the process improvements are reducing the number of manufacturing lots the company needs to run, reducing Amgen’s environmental impact and improving sustainability.

 

About the author: Nick Hutchinson is a Technical Content Marketing Manager at Sartorius Stedim Biotech.
