Effective management of data and knowledge will make it possible to employ advances, such as artificial intelligence and synthetic biology, to their full potential in upstream bioprocessing.
By Claire Hill
The biopharmaceutical market continues to expand at a rapid pace, and demand for upstream manufacturing capacity, particularly in growth areas such as viral vectors, is growing with it. The increasing popularity of single-use bioreactor systems and continuous processing techniques has brought both new opportunities and challenges to upstream bioprocessing. Together, these trends have the potential to transform the industry, but advances in both instrumentation and lab informatics are needed before the full benefits can be realized.
With the exception of a few “blockbuster” monoclonal antibody products, which remain better suited to large-scale stainless-steel bioreactor production, biopharmaceutical companies are increasingly implementing single-use bioreactor systems in the 1000-L to 5000-L range. The key benefits of this approach are a reduced manufacturing facility footprint and upfront investment, faster changeover between batches, reduced risk of cross-contamination, and greater flexibility in batch size and facility configuration. Even Genentech, which owns some of the world’s largest manufacturing facilities, is now embracing this way of working to better fit the increasing number of small-volume products in its pipeline (1). As a result, according to a report by Research and Markets, the global single-use bioprocessing market is expected to reach approximately $7 billion by 2024, growing at a compound annual growth rate (CAGR) of more than 14% from 2019 to 2024 (2).
Continuous processing techniques are another factor contributing to the popularity of smaller, single-use bioreactors. Although the industry as a whole has been slow to adopt these techniques, FDA actively encourages continuous production, and companies such as Sanofi and WuXi Biologics have made significant investments in continuous manufacturing facilities. The potential to dramatically increase productivity is a strong incentive: WuXi Biologics has reported a tenfold increase in productivity with its WuXiUP continuous perfusion-based processing platform compared to traditional fed-batch processing, which enables 1000-L single-use bioreactors to produce as much as 10,000-L stainless-steel bioreactors (3). This higher productivity enables a scale-out approach, using multiple same-size bioreactors rather than scaling up to larger bioreactors for commercial production. The same equipment and process conditions evaluated during development can therefore be used for production, which simplifies technology transfer and reduces risk. Increased productivity and capacity are especially important for therapeutic areas with a shortage of supply, such as viral vectors, and for new advanced therapies such as allogeneic chimeric antigen receptor (CAR) T-cell therapies.
Although the drivers for adoption are clear, single-use systems and continuous processing are not without their challenges. Single-use systems involve a greater reliance on the quality and robustness of the vendor’s supply chain, and concerns about potential problems caused by extractables and leachables, especially during downstream operations, need to be addressed through appropriate studies. The requirements for greater oxygen transfer in high-density cultures and for minimal shear stress on delicate cells pose technical challenges for single-use bioreactor design, and vendors are responding with products such as the HyPerforma DynaDrive Single-Use Bioreactor from Thermo Scientific and the iCELLis fixed-bed bioreactors from Pall. With a scale-out rather than scale-up approach to meeting production requirements, there is a greater risk of variability in process conditions and product quality between individual bioreactors. Bioreactor monitoring is also considerably more complex with continuous processing. Traditional fed-batch processes typically require at-line and off-line sampling only once or twice a day over a period of a few weeks, whereas continuous processing requires many more data points, ideally captured in real time as much as possible. The common factor across all of these considerations is the need for better process understanding and control to ensure that quality objectives are met.
Advances in analytical instruments are one key part of the solution. Novel techniques, including Raman spectroscopy and other advanced monitoring methods, have the potential to reduce reliance on discrete sampling points for off-line analytics such as titer measurements and to improve the ability of real-time analytics to enable fully automated process control (4). Other innovations include single-use sensors, such as the Rosemount 550pH sensor from Emerson, and Internet of Things (IoT)-enabled devices. IoT-enabled devices can provide automated preventive maintenance and just-in-time (JIT) delivery of reagents, automated data collection and analysis, and better integration with control systems such as supervisory control and data acquisition (SCADA) software. The ability to monitor instruments and processing equipment in real time can also help avoid costly downtime. As an example, a lab in Massachusetts used a cloud-connected temperature sensor from Elemental Machines, a Cambridge, MA-based provider of sensors and software, to successfully troubleshoot a problem with a high-performance liquid chromatography instrument after traditional methods failed to identify that the problem was caused by the building’s climate control system (5). High-throughput techniques such as micro-bioreactor systems are another enabling technology, which, in combination with design of experiments (DoE) and multivariate data analysis (MVDA), support a quality by design (QbD) approach to process development. These approaches can generate meaningful process data faster than conventional methods, but without an effective way to manage and interpret that data, the advantages are lost.
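As a simple illustration of how DoE and MVDA-style regression fit together in practice, the sketch below fits main effects and an interaction term for a two-factor, two-level full factorial design. The factor names and titer values are hypothetical placeholders, not data from any of the systems mentioned above.

```python
# Minimal two-factor DoE sketch in coded units (-1/+1), with illustrative
# titer responses (hypothetical values for demonstration only).
import numpy as np

# Full factorial design: factor A (e.g., feed rate), factor B (e.g., temperature)
design = np.array([
    [-1, -1],
    [+1, -1],
    [-1, +1],
    [+1, +1],
])
titer = np.array([1.8, 2.4, 2.1, 3.2])  # g/L, illustrative only

# Model matrix with intercept, main effects, and interaction term
A, B = design[:, 0], design[:, 1]
X = np.column_stack([np.ones(len(titer)), A, B, A * B])

# Ordinary least squares estimate of the effects
coeffs, *_ = np.linalg.lstsq(X, titer, rcond=None)
for name, value in zip(["intercept", "factor A", "factor B", "A x B"], coeffs):
    print(f"{name}: {value:+.3f}")
```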
The inability to effectively manage process data and knowledge across the product lifecycle is arguably one of the biggest hurdles to progress across the industry as a whole. While this may seem surprising given all the recent advances in informatics solutions, such as cloud computing, the reality is that the problem is as much cultural as it is technical. In many companies there are significant organizational barriers between research, development, and manufacturing departments. Even within departments, it’s common for groups working in different locations to use different systems and working practices, which makes collaboration difficult. Resistance to change is another issue, particularly for groups brought together through mergers and acquisitions, as is each group’s desire to maintain control over its own data.
Technical challenges are part of the problem too. Systems designed for research and development, such as electronic lab notebooks (ELN), tend to be fundamentally different from systems designed for manufacturing, such as manufacturing execution systems (MES). Researchers and development scientists need the flexibility and freedom to design and run different types of experiments, as well as to make changes to planned conditions as projects progress. Good manufacturing practice (GMP) manufacturing, on the other hand, requires strict adherence to specifications and procedural controls. As a result of these fundamental differences, knowledge transfer between groups is often reliant on static documents such as Microsoft Excel spreadsheets, PowerPoint presentations, and Word files.
Given these cultural and technical barriers, it isn’t surprising that knowledge sharing and collaboration have been so hard to achieve, or that differences in terminology and in the way data is captured persist.
Despite this, there’s huge value to be gained from continuity in data capture across the development lifecycle, particularly to facilitate technology transfer and to gain the process insight necessary for automated control strategies. For example, development teams can learn from seeing how their processes perform at commercial scale, and manufacturing teams can draw on experimental data from development to help troubleshoot production issues. Data access is also critical for other aspects of process insight, such as understanding the impact of upstream changes on downstream performance. What’s needed is a common informatics platform that facilitates collaboration between groups and provides the right balance between flexibility and procedural controls according to the relevant stage of the product lifecycle. One such solution is the IDBS Bioprocess Solution, which was designed to meet this need by focusing on the core objectives of each group and automating data entry whenever possible. The lack of standardization in both instrumentation and data management is still hindering progress in this area, however.
Standardization for instruments involves both communication protocols and analytical data formats. Communication protocols are less of an issue because most bioreactor control systems and some analytical instruments support the Open Platform Communications (OPC) interoperability standard for integrated process control, and there are a number of middleware options to close the gap for the rest. The bigger problem is the wide variety of analytical data formats. Many analytical instruments use proprietary formats for data files that require an additional step to export into a readable format, such as a text file. Even when non-proprietary file formats are available, the lack of standardized metadata makes automated data access a challenge. Initiatives such as the AnIML open, standardized XML data format and the Allotrope Framework and Data Models have raised awareness of the issues and demonstrated a possible path toward true interoperability, but the resulting standards have yet to be widely adopted.
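A minimal sketch of the general idea, wrapping a plain-text instrument export in a common record with explicit metadata, is shown below; the field names are illustrative assumptions and do not follow the AnIML or Allotrope specifications.

```python
# Sketch: wrap a vendor text export in a common record with explicit metadata.
# Field names are illustrative only and do not follow AnIML or Allotrope.
import csv
import io
import json

# Stand-in for a vendor CSV export (time vs. viable cell density)
raw_export = """time_min,viable_cell_density_1e6_per_mL
0,0.52
60,0.55
120,0.61
"""

def to_common_record(csv_text, instrument_id, run_id, analyst):
    """Parse an exported text file and attach standardized metadata."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {
        "schema_version": "0.1-draft",  # hypothetical internal schema version
        "instrument_id": instrument_id,
        "run_id": run_id,
        "analyst": analyst,
        "units": {"time": "min", "viable_cell_density": "1e6 cells/mL"},
        "data": rows,
    }

record = to_common_record(raw_export, "bioreactor-07-probe", "RUN-0001", "analyst-01")
print(json.dumps(record, indent=2))
```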
The concept of standardization for general data management can be relatively simple, such as ensuring the use of consistent date formats or units of measure (as illustrated in the sketch following this paragraph), but it can also be extended to include more complex ontologies, such as hierarchical relationships and vocabularies, which help make data interoperable both within an organization and with publicly available data sources. Although a wide variety of standards and frameworks have been developed for the biopharmaceutical industry, adoption is variable, and in some cases multiple standards overlap the same scientific domain. It’s even possible to technically adhere to a standard and still not gain the full benefits of interoperability. For example, the ISA88 and ISA95 standards from the International Society of Automation were designed to support manufacturing communications both across the plant floor and with other enterprise applications, and they are widely used throughout the industry. Many companies have looked to use these standards as a framework to model their processes and facilities to improve technology transfer and enable better capacity planning. As illustrated by the BioPhorum Operations Group’s automated facility report team, however, ISA88 and ISA95 provide standard structure but not standard code, and this can lead to significant differences in interpretation and implementation, even within the same company (6).
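At its simplest, this kind of standardization amounts to routines that normalize values before data is shared between systems. The sketch below shows one hypothetical approach for dates and volumes; the accepted formats and conversion factors are illustrative assumptions rather than any industry standard.

```python
# Sketch: normalize mixed date formats to ISO 8601 and volumes to litres.
# The formats and factors below are illustrative assumptions only.
from datetime import datetime

DATE_FORMATS = ["%d/%m/%Y", "%m-%d-%Y", "%Y-%m-%d"]       # formats seen in source systems
VOLUME_TO_LITRES = {"mL": 0.001, "L": 1.0, "m3": 1000.0}  # conversion factors to litres

def normalize_date(value):
    """Return the date as an ISO 8601 string (YYYY-MM-DD)."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value}")

def normalize_volume(value, unit):
    """Convert a volume to litres using the lookup table above."""
    return value * VOLUME_TO_LITRES[unit]

print(normalize_date("21/01/2020"))       # -> 2020-01-21
print(normalize_volume(2000, "mL"), "L")  # -> 2.0 L
```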
Ultimately, the goal of increasing automation for both data analysis and process control relies on the ability of machines to make sense of a highly complex data landscape. Many companies are now looking to adopt the findability, accessibility, interoperability, and reusability (FAIR) guiding principles for scientific data management across their organizations, and these principles are actively promoted by the European Commission. First published in 2016, the four foundational FAIR principles apply to both human-driven and machine-driven activities and are particularly important for advanced analytics, such as artificial intelligence and machine learning (7).
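One way to picture machine-actionable, FAIR-aligned metadata is a record that carries a persistent identifier, descriptive keywords, explicit access conditions, and provenance, as in the sketch below; the field names and values are illustrative placeholders rather than a formal FAIR schema.

```python
# Sketch of a FAIR-aligned metadata record: persistent identifier, descriptive
# metadata, explicit access conditions, and provenance. Field names and values
# are illustrative placeholders, not a formal FAIR schema.
import json

dataset_metadata = {
    "identifier": "doi:10.0000/example-upstream-run-0001",  # placeholder identifier
    "title": "Upstream perfusion run 0001, 50-L single-use bioreactor",
    "keywords": ["upstream", "perfusion", "viable cell density", "titer"],
    "license": "internal-use-only",  # FAIR does not require open access
    "access_conditions": "authorized project members via authenticated API",
    "provenance": {
        "created_by": "process development, site A",
        "instrument": "bioreactor-07",
        "derived_from": ["doi:10.0000/example-media-batch-0042"],
    },
    "format": "application/json",
}

print(json.dumps(dataset_metadata, indent=2))
```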
Although interest in FAIR principles is high, adoption is still slow due to the cultural and technical challenges mentioned previously as well as the lack of a comprehensive data management and data governance plan in many companies. One of the most common objections is the fear that the openness promoted by the FAIR principles is incompatible with intellectual property protection. This is not the case, however, as privacy and security requirements can still be met when implementing FAIR principles as long as the required access and authorization protocols are clearly defined and based on open standards (8). In response to these and other concerns, the Pistoia Alliance, a global not-for-profit members’ organization, is working to provide a FAIR toolkit to help support the implementation of FAIR data management within the life science industry (9).
In summary, the good news is that despite all the current challenges, the future is bright. Once a solid data and knowledge management framework is in place, significant advances, such as the use of artificial intelligence and synthetic biology, will be possible, and the full potential of single-use systems and continuous processing techniques to transform upstream bioprocessing can be realized. Getting there requires enabling technologies such as advanced instrumentation but also, more fundamentally, a greater focus on data governance and a change in mindset that encourages more open collaboration across the industry. Innovations in manufacturing can then finally keep up with the scientific advances of recent years, and the commonly accepted time and cost of bringing new products to market can be dramatically reduced.
References
1. D. Stanton, “Genentech on Tech Transfer: ‘Small Volumes are All a Bit New to Us’,” BioProcessIntl.com, March 20, 2019.
2. Research and Markets, “Single-use Bioprocessing Market – Global Outlook and Forecast 2019-2024,” ResearchandMarkets.com, November 2018.
3. WuXi Biologics, “WuXiUP—Continuous Biomanufacturing Platform with Ultra-High Productivity,” accessed January 20, 2020.
4. J.D. Goby, et al., “Control of Protein A Column Loading During Continuous Antibody Production: A Technology Overview of Real-Time Titer Measurement Methods,” BioProcessIntl.com, September 21, 2019.
5. A. Olena, “Bringing the Internet of Things into the Lab,” The-Scientist.com, June 1, 2018.
6. BioPhorum Operations Group, Biomanufacturing Technology Roadmap: Automated Facility, July 2017.
7. M.D. Wilkinson, et al., Sci. Data 3, 160018 (2016).
8. J. Wise, et al., Drug Discovery Today 24 (4) 933–938 (2019).
9. A. Smith, “Pistoia Alliance Launches FAIR Implementation ‘Toolkit’,” PharmaTimes.com, July 17, 2019.
About the author
Claire Hill*, claire.hill@idbs.com, is Solutions Support Manager at IDBS.
*To whom all correspondence should be addressed.