Pause For Analysis – Cheryl Scott, Senior Technical Editor, BPI Magazine


For over a decade, BioProcess International has considered analytical methods to be such an integral part of biopharmaceutical product and process development — across the board — that in our rotating thematic arrangement of issues (upstream, downstream, manufacturing, and the new bioexecutive theme in December) we never thought to include a separate “analytical” theme. After all, production process development (the focus of our January, April, and September issues) involves cell-line engineering and characterization, media optimization studies, and methods validation. Downstream process development (February, May, and October) involves viral safety issues, chromatography optimization studies, and more validation work. With its focus more on product than process, our manufacturing theme (March, June, and November) has often encompassed preformulation/formulation work and product characterization as well as quality assurance and control.

When we looked at the evolving approaches to FDA’s quality by design (QbD) initiative, it all made sense to us. You can’t have any of those things without a strong analytical laboratory and documented understanding of process and product. The dawning biosimilars era underscores our case. The only way that such products can be offered to patients at lower prices than their corresponding originator drugs is if they can be made more efficiently, at less cost, without compromising quality, safety, or efficacy. So how will biosimilars manufacturers do that? They need to cut out expensive clinical trials, for one thing — and the only way to do so is through intense product characterization and comparison with originator samples. And brand new products such as antibody–drug conjugates are presenting greater characterization challenges too (we’ll be looking at those in depth with a fall 2014 supplement on the topic).

Once again, enter the analysts. Is it any wonder that we saw fit to start our “BPI Lab” series in January 2013? So far, I’ve covered some of the most vital technologies to biopharmaceutical laboratories under the auspices of that title: LC–MS, PCR, spectroscopy, electrophoresis, and more. Later this year, I’ll be looking at calorimetry, light scattering, and circular dichroism. These are just some of the critical technologies that you’ll see providing data in support of numerous presentations throughout this year’s BPI European Summit program. They’ll even take center stage in Conference 4. And in upcoming editions of BPI’s weekly e-newsletter series, analytical methods will be getting their own focus once a month.

It is true that as analytical methods increase in power, selectivity, and resolution, biopharmaceutical companies may find themselves in a vicious circle: the more you can find, the more you will find. For example, a biosimilar version of a product originally approved 20 years ago may seem to have contamination issues that the originator didn’t — because modern analytical technologies are able to identify contaminants that were not detectable before the turn of the century. In fact, the comparator samples may turn out to be even more contaminated, and perhaps biosimilars makers can document that as part of their own market-authorization application process. It all remains to be seen.

So the modern question becomes, “How much analysis is enough analysis?” Regulators around the world seem to be saying, “Keep looking. Keep up with advancing technology. Keep us in the loop.” QbD and the process analytical technology (PAT) that supports it seem to have arisen at least in part from the FDA’s, EMA’s, and others’ acknowledgment of the analytical advances we’ve seen over the past few decades. But how well are the “watchmen” really able to keep up with it all themselves? And who wants to be the first company to present (and explain, and justify) a new method in its product dossier?

It’s a sticky, tricky issue — and one that won’t be solved any time soon. But the kinds of discussion you’ll find at BPI Europe and in BPI itself are vital in moving us all toward some kind of solution(s) as the industry evolves and continues to mature. And I’m curious about your thoughts on this subject in general.

Regenerative medicine may allow us to regrow heart muscle

by Jennifer Ellis, Director of Marketing, LabRoots

Regenerative medicine may have just received a boost: scientists have transformed skin cells into beating heart cells. This new method, devised by researchers at the Gladstone Institute, is more efficient and, more importantly, allows for complete reprogramming of skin cells into heart cells to regenerate muscle lost by heart attack victims.

Heart disease is a leading cause of death around the world, contributing to an estimated 17 million deaths every year. Advances in medicine and treatments have improved survival rates of heart attack victims in recent years. However, with the growing rate of heart attack survivors comes an increase in the number of people living with heart failure. Heart failure is a chronic condition in which the heart cannot beat to full capacity, most likely due to muscle loss during a heart attack. The scientific and medical communities have now turned to cellular reprogramming as a potential treatment for heart failure, to regenerate the damaged heart muscle itself.

Typically, reprogramming skin cells into heart cells is a complicated process requiring the insertion of several genetic factors. Genetic manipulation of this sort directly reprograms the cell and can be very time-intensive. Additionally, there have been issues with scaling the gene-based method into effective, applicable treatments, such as high cost, long timelines, and the need to individualize treatments rather than deliver a bulk therapy. Now, a different approach has been discovered by Gladstone investigator Sheng Ding, PhD, and his team.

Instead of inserting various genetic factors, the team searched for small molecules in the skin cells of adult mice that, when mixed in certain combinations, could make the skin cells behave more like contracting heart cells. The researchers found a small-molecule “cocktail”, called SPCF, made up of four compounds that almost completely reprogrammed the cells, causing them to contract and twitch. By adding one genetic factor, Oct4, to the cocktail, the team was able to create completely reprogrammed beating heart cells whose electrical signalling patterns resembled those normally seen in ventricular heart cells. These results offer a more efficient and less complicated approach to reprogramming, and could lead to a pharmaceutical-based process to regrow heart muscle.

Technology Transfer for Biopharmaceuticals

by Richard Dennett, Director, Voisin Consulting Life Sciences

What is Technology Transfer?

Technology transfer happens at each pivotal stage of the product/process development lifecycle of a biopharmaceutical product. It can occur, for example, from development into cGMP manufacture, for the purpose of licensing, or as a function of outsourcing contract manufacture. At each of these stages it is essential to demonstrate that the process remains robust and that the product (which can be influenced by the process) is comparable pre- and post-transfer. To do this it is necessary to transfer the methods, materials, equipment, training, and know-how required to realize the transfer. Whilst this may look easy when written in a couple of lines of a blog, in reality, for the manufacture of biopharmaceuticals, it can be very complex, and a fault in a single aspect of technology transfer can seriously impact the whole program.

How do you know that you’ve achieved technology transfer?

Key to this is meeting pre-determined acceptance criteria, such as:

  • Qualification that the analytical methods can be reproduced
  • Demonstration that the process can be operated in accordance with the manufacturing instruction
  • Comparability of the product – it is essential that the active product meets equivalent pre- and post-transfer specifications; this can involve testing of general specifications through to in-depth characterization of higher-order physicochemical properties to highlight product integrity or product-associated impurities

So what tips have you got for making a successful technology transfer?

  • ‘Buy-in’ – development, manufacturing, and quality should all be represented on the technology transfer team, and this should be replicated within both the giving and receiving parties.
  • Planning – plan ahead. An emerging biotech may think that technology transfer will take a couple of weeks; a seasoned biotech knows that successful transfer can take several months.
  • Strong project management and communication are essential

What advice would you give anyone currently embarking on technology transfer?

Build up a solid understanding before you start and, if possible, gain access to someone who’s had experience of doing it before.

What are the trip hazards?

  • Not following the plan of what needs to be transferred, how this will be done, and the assignment of the correct acceptance criteria.
  • Lack of proactive project management
  • Rushing into things

What is your experience with technology transfer?

During my career I’ve worked at the development/cGMP interface of projects, as a technology transfer manager in a contract manufacturing organization, and now within regulatory CMC – so I have had great opportunity to see technology transfer from all angles.

Technology transfer from a regulatory point of view is key for supporting comparability within the dossier submission, and technology transfer protocols and reports may be scrutinized as part of pre-approval inspections – so it’s serious stuff.

What are your key take home messages?

  • Treat technology transfer seriously
  • Take time
  • Gain buy-in
  • Plan correctly
  • Make sure you know what you are transferring and how you will achieve acceptance of the transfer.
  • If it isn’t written down, it never happened: quality documentation, including technology transfer protocols and reports, is essential.

We heard that you’re a fan of paper aeroplanes? 

Yes, look out for these at BioProcess International. I’m actually an amateur pilot and recently had a crash landing in the last plane I flew, a 1930s-designed ‘flying flea’. Technology transfer of the understanding of its flight characteristics was to blame; paper aeroplanes are much safer!

2014 – Why is technology transfer important today and what added value does it hold?

Technology transfer is essential. Correct technology transfer can mean the difference between a successful and a failed project. Done correctly, it can build in product/process robustness and expedite your development program, thereby reducing costs and time to market.

Trends, technologies and interest in Continuous Processing

by Dr Margit Holzer, Scientific Director, Ulysse Consult

Minimizing operational costs and capital expenditures is a considerable challenge for production in the biopharmaceutical industry. Many different contributions have to be taken into account when calculating these costs: capital investment, materials, consumables, labour, utilities, and waste handling, together with the time and cost of development, scale-up, qualification, and validation. All of these add up to make production expensive.

One way to address this issue is to switch from batch processes to continuous ones, which are well established in other industries such as food processing. When switching, the entire physical processing installation can be shrunk: clean rooms, utilities, tank farms, and processing and other equipment can often be reduced by half, and operating costs for production, cleaning, and analysis can be reduced significantly.

In 2007, Novasep published a cost study showing that a switch from batch to continuous downstream processing of monoclonal antibodies could reduce operational costs by 69% – a huge reduction and a very promising route to a more economical production chain.
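To see how a headline figure like that arises, it helps to think of operating cost per gram of product as a sum of cost components. The sketch below uses entirely hypothetical component costs (these are illustrative assumptions, not Novasep’s published numbers) to show the arithmetic: if continuous operation cuts resin, buffer, cleaning, and labour consumption per gram while output holds steady, the component savings compound into a large overall reduction.

```python
# Hypothetical, illustrative cost model -- NOT Novasep's actual figures.
# Demonstrates how per-gram operating cost falls when a downstream step
# runs continuously: resin and buffer are used more efficiently, smaller
# equipment means less cleaning, and labour per gram drops.

def operating_cost_per_gram(resin, buffers, labour, cleaning, utilities, grams):
    """Sum the annual operating cost components and divide by annual output."""
    return (resin + buffers + labour + cleaning + utilities) / grams

# Assumed annual costs (arbitrary currency units) for the same 1000 g output.
batch = operating_cost_per_gram(resin=400, buffers=300, labour=200,
                                cleaning=150, utilities=100, grams=1000)
continuous = operating_cost_per_gram(resin=110, buffers=90, labour=80,
                                     cleaning=40, utilities=40, grams=1000)

reduction = 1 - continuous / batch
print(f"batch: {batch:.2f}/g, continuous: {continuous:.2f}/g, "
      f"reduction: {reduction:.0%}")  # roughly a 69% reduction
```

The point of the exercise is that no single component needs a 69% cut on its own; moderate savings across several cost lines, at constant output, are enough to produce a reduction of that order.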

The question is then: how can one convert current batch processes into continuous ones? Related challenges include issues of quality and process control and the need for further process understanding and characterization. However, modern technologies using the Process Analytical Technology (PAT) approach for in-line/at-line/on-line controls make it possible to obtain an adequate level of information on process performance, as well as tighter control of product quality, during development, scale-up, and production. A more structured and scientific approach to process characterization and design, facilitated by process simulation tools, is equally desirable for both batch and continuous processes, for developers and for regulatory bodies.

Switching a batch chromatography step to a continuous process has been challenging for many years. The main reasons are:

  • the chromatographic process needs to be robust and integrate cleaning steps,
  • the equipment (including that required for the process monitoring/control strategy) needs to be extremely reliable, cleanable, and easy to handle and maintain,
  • it must be possible to qualify and validate the equipment, the software, the process and the cleaning to meet regulatory standards,
  • the “where is my batch?” syndrome: while the batch size is perceived as naturally defined in a discontinuous process, it requires definition for continuous processes. Note that such definitions have already been accepted by authorities for several APIs.

Last but not least, regulatory bodies like the FDA support the implementation of continuous manufacturing using a science- and risk-based approach, because it can result in improved critical quality attributes, more process control, less product holding time, and processing under steady-state conditions.

The EMA and FDA assess QbD

“The European Medicines Agency (EMA) and the United States Food and Drug Administration (US FDA) have published a joint question and answer document that outlines the conclusions of their first parallel assessment of quality-by-design (QbD) elements of marketing-authorisation applications.

Quality-by-design is a science- and risk-based approach to pharmaceutical development and manufacturing that was introduced a few years ago in international guidelines intended for the pharmaceutical industry. QbD involves the use of statistical, analytical and risk-assessment methods to design and develop pharmaceutical compounds and manufacturing processes to ensure the quality of the manufactured product.

. . .

The objective of this parallel assessment is to share knowledge, facilitate a consistent implementation of the international guidelines on the implementation of the QbD concept (International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines Q8, Q9, Q10 and Q11) and promote the availability of pharmaceutical products of consistent quality throughout the European Union and the US.”

At BioProduction this year we have a dedicated QbD track:
QbD: Development through to Biomanufacturing

At this year’s event we have case studies highlighting the successful implementation of QbD, advice on how to get your QbD programme passed by the regulators, and sessions on implementing QbD for upstream processing.

Our QbD speakers include:

  • Dr Mark Uden, Head of Biopharm Process Research, The Biopharm R&D Unit, GlaxoSmithKline, UK
  • Dr Patrick Gammell, Principal Development Scientist, Pfizer, Ireland
  • Dr Alex Eon-Duval, Project Coordinator, Biotech Process Sciences, Merck Serono SA, Switzerland
  • Dr Brij Patel, Deputy Manager and Assessor, MHRA, UK

For more information about BioProduction 2013 and what you can learn from our QbD expert speakers, visit our website.