Revolutionary Algorithm Mogrify Set to Transform the Field of Regenerative Medicine

Credit: Nature Genetics & Rackham et al.

This week a paper published by an international team of researchers led by the University of Bristol sent excitement rippling through the cell therapy community. The team, comprising collaborators from Bristol, Australia, Singapore and Japan, published the breakthrough last Monday (18/01) in Nature Genetics. They presented a predictive system, Mogrify, that can forecast how to create any human cell type directly from another, bypassing the need for exhaustive trial and error.

The team, led by Julian Gough, has so far applied Mogrify to 173 human cell types and 134 tissues, outlining an atlas of cellular reprogramming. Speaking about the breakthrough, Gough, professor of bioinformatics, told the University of Bristol that “the barrier to progress in the field is the very limited types of cells scientists are able to produce. Our system, Mogrify, is a bioinformatics resource that will allow experimental biologists to bypass the need to create stem cells”.

Listing further achievements from the research, conducted in collaboration with Professor Jose Polo at Monash University in Australia, Gough confirmed that Mogrify had validated two new transdifferentiations. The algorithm succeeded at the first attempt in both cases, and it is this speed in achieving results that lends weight to the claim that Mogrify is revolutionary. Professor Gough added that he hoped “Mogrify will enable the creation of a great number of human cell types in the lab”.

This is a huge result for the regenerative medicine field and will no doubt accelerate advances in life-changing medicines. In particular, the ability to produce many types of human cells will lead directly to new tissue therapies and a much improved understanding of cell production at the molecular level. One hope going forward is the potential to grow whole organs from a patient’s own cells.

For five years Gough collaborated with Dr Owen Rackham to create a computational algorithm to predict the cellular factors needed for cell conversions. This was achieved largely thanks to data collected as part of the FANTOM international consortium, of which Gough is a member.

To highlight the scale of the achievement, it should be remembered that scientists have discovered conversions of human cells only a handful of times since Japanese researcher Shinya Yamanaka created the first human induced pluripotent stem cells in 2007.

The algorithm has been released and made available online for others to use, so that the field may advance at a much faster pace.

The paper, ‘Mogrify: An Atlas for Direct Reprogramming Between Human Cell Types’ by Rackham et al., is published in Nature Genetics.

Challenges Facing the Cell Therapy Community – and How to Overcome Those Hurdles

It’s no surprise that cell therapy companies face a myriad of hurdles when developing a product from bench to bedside.

The range of challenges can sometimes feel overwhelming – even for the most seasoned pros in regenerative medicine. So we set out to ask some of the leading names in the industry what some of the biggest hurdles are, and what can be done to overcome them.

Aby Mathew, Ph.D., Senior Vice President and CTO, BioLife Solutions

Aby believes one of the biggest challenges is that companies set up their processes for clinical feasibility, not necessarily commercial viability. ‘So they’re focused on, does the therapy work, not necessarily can we scale it up, can we make it economically viable, can we deliver it globally?’ Aby says.

So what’s the solution? Aby believes it’s down to human capital and mass manufacturing: ‘Down the road, what they might look to do is automate their processes, so instead of having a manufacturing process that might take 50 or 100 people, they might be able to implement a machine that can reduce the head count, but can also reduce the possibility of human error.’

Another solution, Aby says, is to buy in bulk. ‘In the manufacturing process, things such as media components, the instruments you have to use, or the components it’s packaged up in all end up adding to the individual cost. But when you go and buy that from your suppliers, you’re going to get a much better price if you order 1,000 units rather than 10 units, and that’s really where that scale-up starts showing the cost-of-goods benefit.’

Jon Rowley, Ph.D., Chief Executive and Technology Officer, RoosterBio, Inc.

Jon Rowley agrees. He believes a big challenge that many companies have is being able to manufacture products at commercially relevant lot sizes while maintaining all the quality parameters of the cells. ‘They need to make sure that those stem cells don’t lose the biological functionality that they have that makes them therapeutic in the first place. And during scale-up, this can definitely be lost if you’re not looking at the right things,’ Jon says.

His solution? Jon says companies need to understand what the quality parameters are, get the technologies in place to manufacture at a much larger scale, and then manufacture these cell therapy products at costs that enable companies to go to market with them and actually have a profit left over at the end of the day.

Julie G. Allickson, Ph.D., Director, Regenerative Medicine Clinical Center, Wake Forest Institute for Regenerative Medicine

Julie Allickson notes that regulation and due process can pose a challenge to many companies. She advises, “moving to regulatory approval in an efficient manner means really early discussions with the FDA, being able to share with them your knowledge in regards to the technology, and to be able to get their input.” She adds there are several opportunities to talk to the FDA along the way that companies should take advantage of in order to make the process run as smoothly as possible.

Discover more technical and commercial strategies to deliver cell and gene therapy products to patients at the Cell Therapy Manufacturing & Gene Therapy Congress. View the agenda and buy tickets here:

How to Master Data Management During Bioproduction

Advances in technology allow us to make better, wiser decisions that can shape the way we develop products. With better analytics, however, comes more data, and the question now is how to manage it. We asked Jarrod Medeiros, Product Specialist at the leading data management company IDBS, for his thoughts on the subject and what advice he could share.


Informa: You recently published a white paper on the subject. Why was that?

IDBS: Because we wanted to educate the community on some of the problems we’ve solved for similar customers.

Informa: How can ineffective data management impact the development of biologics?

IDBS: There are a few effects. A major one is efficiency: we’ve found through working with various customers that a large portion of scientists’ time is spent looking for data, transcribing data, and on other low-value activities associated with paper notebooks, binders, and MS Excel. Another big effect we’ve seen is on quality. This includes the quality of the data itself, which can be compromised by handwriting and transcribing between systems, but also the quality of the output: when data is ineffectively managed, the reports and analyses that can be done are limited.

Informa: What advice would you give to industry to handle the vast volumes of data produced in biological development?

IDBS: Use E-WorkBook! Come talk to us! I suppose the first step is to admit there is a problem, and then to work with vendors that have expertise in solving these challenges.

Informa: How should data management systems be applied to improve decision making in the development and manufacturing of biological products?

IDBS: One of the largest impacts on decision making comes from the ad-hoc analyses that can be done. It’s easy to prepare data and visualizations that need to be used regularly, but when something unexpected happens it is time consuming and sometimes impossible to pull data together in an ad-hoc manner. Data management systems should capture the relationships and context required to be able to quickly view traceability and make decisions.

Informa: What do you see as the best strategies for gaining insight from the increasing amount of data generated and translating this data into knowledge outcomes for products and processes?

IDBS: The best strategy is to move away from a paper- and MS Excel-based process as soon as possible. The longer you wait to implement an electronic system for data management, the more of your historical data remains in an unusable state.

Join us at BioProduction 2015 to hear more from Jarrod Medeiros, where he will be presenting on leveraging laboratory informatics for the bioprocessing industry on 15th October at 12:05.

The Economic and Operational Benefits of Single-Use Technologies

Single-use, or disposable, bioprocessing technologies continue to gain acceptance within the biopharmaceutical industry and are now commonly used. The nature of the decisions bioprocess engineers must make around single-use technology has changed since its introduction, which actually came relatively early in the history of the biopharmaceutical industry. Early studies investigated whether manufacturing in disposable equipment was even feasible; once technical feasibility had been established, the focus switched to whether there were economic or operational advantages in investing in the technology to a greater or lesser extent. Many bioproduction practitioners came to the conclusion that the benefits of single-use were real.

Single-use technologies: The economic case

From an economic perspective, single-use technologies reduce capital expenditure on equipment and utilities. Although the cost of consumables may put upward pressure on the cost of goods, this is balanced to some extent by the reduced cost of running the steam, cleaning and water-generation facilities that stainless steel plants use to clean and sterilize reusable equipment. The time taken to build and validate facilities falls as their complexity decreases, which further reduces costs. Critically, it also allows the decision to invest in a facility to be delayed until the likely success of the biopharmaceutical product in the clinic is better understood. Given the high attrition rate of pharmaceutical products as they pass through clinical trials, the benefit to the sponsoring company is significant.

Single-use technologies: The operations case

From an operations point of view, single-use technologies increase speed by reducing the downtime required for cleaning and sterilization protocols. They also increase flexibility, both by allowing multiple process configurations for different products to be implemented readily in the same facility and by facilitating process modifications, particularly before the process is approved by regulatory authorities. The quality performance of operations can be improved by using disposables, as the risk of product and batch cross-contamination is avoided; however, debates continue to rage regarding the nature and extent of testing that must be performed on single-use parts to show that they have no impact on the safety or efficacy of the biopharmaceutical. Further, the nature of biomanufacturing operations must change to accommodate disposables, with sponsors becoming increasingly reliant on the quality organizations of their suppliers. Sponsors must also consider the robustness of their supply chain, ensuring the dependability of their own operations so that their biological medicines are always available to the patients who need them.

The future of single-use technologies

New single-use technologies are being developed and evaluated, companies are applying single-use concepts to increasingly diverse bioproduction processes, and the technology is being utilized in new manufacturing paradigms such as continuous biomanufacturing. To an ever-growing extent, however, the debate is switching from ‘whether disposables should be implemented’ and ‘to what extent’ towards questions about how to implement them most effectively. Key questions remain on extractables testing, supply chain security, standardization and the facility designs that best accommodate single-use processes. All of these and more will be discussed at BioProduction 2015 in Dublin on 14-15 October.

Dr Nick Hutchinson

Join me at #Bioproduction15

Contact me at

Dr Nick Hutchinson has a Masters and Doctorate in Biochemical Engineering from University College London, UK where he focused on laboratory tools for rapid bioprocess development and characterization. He then worked at Lonza Biologics in an R&D function investigating novel methods for large-scale antibody purification before moving to an operational role scaling-up and transferring manufacturing processes between Lonza sites in the UK, Spain and USA. Nick now works in Market Development at Parker domnick hunter where his focus is in bringing Parker’s strengths in Motion & Control to Bioprocessing. This will enable customers to improve the quality and deliverability of existing and future biopharmaceuticals.

High Throughput Process Development Tools at Novartis Pharma

High throughput tools and scale-down models are of increasing interest to the biopharmaceutical industry, where they can be used to perform process research quickly and without the expense of running experiments at production or pilot scale. Technologies are being developed and used to study both upstream and downstream processing steps, and both will form sessions at BioProduction Europe 2015, to be held in Dublin in October.

One field of process research in which high throughput process development tools are particularly applicable is in the generation of process design spaces as part of process characterization activities. Process characterization studies are typically performed during late stage process development in order to develop robust process control strategies for process qualification and commercial manufacturing batches. Biopharmaceutical companies are increasingly adopting Quality by Design methodologies in order to characterize their bioprocesses. This ensures that when operating within their defined design space their manufactured products will have the necessary critical quality attributes for them to be safe and efficacious.

Dr Jean Aucamp, Lab Head in Protein Processing at Novartis Pharma AG, says the following of Novartis’ use of high throughput techniques for developing chromatography steps: “High-throughput process development (HTPD) is an approach where experimentation can be parallelized and automated in order to investigate large process design spaces quickly. There are numerous ways to conduct chromatography studies in high-throughput mode. HTPD chromatography refers to protein purification studies performed using miniaturized chromatography columns and a liquid-handling work station. At Novartis this technology is used across all stages of development to support column purification studies and shorten project timelines”.

The challenge high throughput process development technologies present, however, is the extent to which they replicate large-scale performance. If they represent it poorly, the design spaces generated could be meaningless. Dr Aucamp says: “It is well known that results obtained from chromatography studies performed in high-throughput mode do not compare directly to laboratory-scale data.” Novartis have conducted a series of fundamental studies looking at the discrepancy in process data between laboratory and high throughput technologies, which will be presented at the BioProduction Europe Conference.

Describing the outcomes of the work, Dr Aucamp said: “A better understanding of the differences that lead to scale offsets was obtained. This understanding allowed the development of an approach where results from the different scales can be better related.”

Dr Aucamp’s presentation “Assessment and Implementation of HTPD Chromatography for Process Characterisation Studies” is scheduled for 5pm on 15th October as part of the Downstream Processing Track.

Have your say

To what extent has your organization implemented High Throughput Process Development? What are the key challenges you have experienced?


Continuous Manufacturing within the Biopharmaceutical Industry

Continuous manufacturing is a key topic within the biopharmaceutical industry at the moment. It will be the focus of one of the five tracks at BioProduction 2015, which will be held in Dublin in October.

Why, then, is continuous biomanufacturing attracting so much attention at the moment? Despite much analysis, the answer to this question is in fact not that simple.

Continuous Upstream Bioprocessing

Performing cell culture in a continuous fashion has always made sense when the biopharmaceutical being expressed is vulnerable to degradation within the environment of the bioreactor. Monoclonal antibodies, which have historically been key drivers of growth within the industry, are usually relatively stable, but biopharmaceutical companies these days can have diverse pipelines that include other recombinant human proteins, among them enzymes, whose quality can be reduced by prolonged exposure to bioreactor conditions.

Another reason for implementing continuous upstream bioprocessing is that it allows higher cell concentrations and product titres to be achieved. In this way, the same amount of product can be manufactured in smaller bioreactors. Utilizing smaller bioreactors reduces the capital cost of the vessel itself and also enables the use of single-use bioreactor technology, saving money, time and complexity in the set-up of utilities required to run a stainless steel system. The burden of battling the laws of physics in scaling up to 15,000 L or even 20,000 L bioreactors is diminished, although it can be argued that it is replaced with the burden of battling microbial ingress leading to contaminations. Increasing the production output of perfusion systems is typically achieved by increasing the number of bioreactors rather than their size. This allows operational flexibility not only as to when product is produced but also where: biomanufacturers can choose to produce biologics all over the world, close to emerging markets, as part of global supply chain networks.

Continuous Downstream Bioprocessing

A key driver for operating downstream processes in a continuous manner has been the significant increase in upstream product titres over the past 15 years. This has created purification bottlenecks that must be addressed to ensure all the product synthesised can be purified. Continuous chromatography technology has come of age in recent years and is being promoted as a viable alternative to traditional batch methods. In industries in which continuous chromatography is more common, it increases resin utilization, reduces downtime and optimizes consumable costs, including buffer components.

Integration of Upstream and Downstream Continuous Steps

Genzyme, in particular, have stressed that for the full potential of continuous bioprocessing to be realised, continuous upstream and downstream operations must be effectively integrated. They achieved this by using hollow-fibre filtration and an alternating tangential flow system to give a highly clarified filtrate that can feed a continuously operated capture chromatography step. This philosophy of coupling upstream and downstream steps will be the subject of a Knowledge Exchange Roundtable Discussion featuring Neha Shah of Genzyme and Massimo Morbidelli at the BioProduction 2015 event.

Whether you are simply considering if continuous bioprocessing will suit your product, planning your approach or seeking to overcome challenges in implementation, this is an event you must attend this autumn.


BioProduction 2015 – 4 Conferences, 1 Forum, 1 Exhibition


14-15 October 2015

Citywest Conference & Event Centre, Dublin, Ireland

Benchmark technological developments and explore best practices in biomanufacturing

The biopharmaceutical industry is making a capital investment of approximately $8 billion in new facilities in Ireland, most of it in the last 10 years, representing one of the biggest waves of investment in new biotech facilities anywhere in the world.

Join us at BioProduction 2015 in Dublin this October to see why so much is being invested in this area and to network with leading players within this growing industry!

BioProduction 2015 is Europe’s leading and largest event for a comprehensive update on all aspects of large-scale biological manufacturing. It provides insights on the latest technologies, upstream/downstream processing, process analytics, the implementation of continuous manufacturing, facility design, facility flexibility and single-use systems to reduce inefficiencies in the biomanufacturing process.

4 Conferences – 1 Exhibition – 1 Congress

  • Conference 1: Continuous Manufacturing
  • Conference 2: Upstream Processing – Production, Development & Analytics
  • Conference 3: Manufacturing Strategy & Technology
  • Conference 4: Downstream Processing

250+ Industry Experts – 130+ Companies Represented – 10 Interactive Discussions

A global panel of senior-level industry professionals from companies including:

  • Sean McEwen, Vice President, Biologics Manufacturing, AbbVie, Ireland
  • Guy McDonnell, Director of Engineering & EHS, Pfizer, Ireland
  • Trent Carrier, Vice President, Vaccine Technology & Engineering, Takeda Vaccines, USA
  • Roman Necina, Vice President Process Science & Technical Operations, Baxalta, Austria
  • Thilo Henckel, Vice President Manufacturing, Roche Diagnostics GmbH, Germany
  • Lada Laenen, Senior Director, Allston Landing Manufacturing Science and Technologies Head, Genzyme Corporation, USA
  • Lars Dreesmann, Executive Director, Head of Clinical Supply & Transfer, Biopharma Bioprocess & Pharmaceutical Development, Boehringer Ingelheim, Germany
  • Ciaran Brady, Director, Biotech Technical Services/ Manufacturing Sciences, Eli Lilly and Company, Ireland
  • Joe Runner, Manufacturing Technical Specialist, Genentech, USA
  • Weibing Ding, Principal Scientist, Process Development, Amgen Inc., USA

For more information on the 2015 event please visit the event website at