An Argument for Change – The Promise of the Next Decade

This article forms part of the CPhI 2017 Annual Report, which will be released during the CPhI Worldwide event in Frankfurt (October 24-26, 2017). http://www.cphi.com/europe/cphi-annual-report

Over a decade ago I wrote an article advocating that industry and FDA truly embrace the 2004 FDA guidance,1 which supported a shift from a documentation- and testing-based framework for assessing quality and compliance to one predicated on scientific understanding. This fundamental change was to lay the foundation for catalyzing business and technical performance. For nearly half a century the pharmaceutical industry has lagged other industries in productivity, efficiency and, some might argue, innovation. The excuse was that, as a regulated industry, our flexibility to implement improvements is limited and compliance is of paramount importance. It is incumbent upon us to produce safe and efficacious drugs; both caregivers and patients demand it. Yet caregivers and patients are largely oblivious to the myriad quality issues that plague our industry, so one could argue that were it not for the rigorous compliance standards imposed by regulatory authorities we would be in a fine mess. Despite the compliance mindset ingrained in our strategic thinking, the bigger question is: can we, as an industry, foster innovation and drive business performance while guaranteeing drug efficacy and safety? For decades, the industry's answer was no. We were not built for efficiency, speed, nimbleness AND creativity. Our track record for bringing new drugs from discovery to market is abysmal, by design! Several seminal events have forced a change in that paradigm, specifically:

  • the establishment of the emerging markets
  • the growth in adoption of PIC/S
  • the adoption of quality risk management
  • the transformative effects of large-scale data analytics
  • the adoption of e-clinical, lifestyle diagnostic, and digital health solutions.

Together these factors plot a new course for the industry that will transform not only drug development but also the approach to healthcare on a global basis for the next decade.

Maturation of the Emerging Markets

The establishment and growth of what we once called the emerging markets has transformed the global pharma marketplace. It is unlikely we will see again in our lifetime the enormous economic impact the BRIC (Brazil, Russia, India and China) countries have had on the global economy, now constituting over 20% of world GDP.2 Initially these markets offered cheap, viable labor for the global marketplace as well as significant opportunity in their own regions. With lower labor costs that could potentially cut the cost of generic drugs by 50%, most large pharma companies looked to these markets to catalyze business performance in a decade when the global economy was in a tailspin. This in turn drove massive consolidation within our industry, as cutting operating costs became a short-term fix for sagging revenues. However, expectations of quality were not relaxed during the global recession. So while these markets provided a short-term solution, they also represented significant quality risk as manufacturers operating to lower compliance standards began their foray into the U.S. and European marketplaces. As drug developers extended their footprint, the paradigm shift required to meet U.S. and European quality standards became starker. One quality event could wipe out several years' worth of operational savings or, worse, cause safety issues that could damage a brand for years, if not forever. Companies that had been burned, or that recognized the risk, shifted their strategy toward markets with lower regulatory and compliance risk. In the drug development process, this meant moving away from the short-term pursuit of cheaper cost of goods sold (COGS) through lower-cost labor for manufacturing and clinical trial execution for the U.S. and European markets, toward the long-term benefit of harnessing low-cost expertise in early drug discovery, where the tolerance regarding GMP compliance is very high. This new focus on outsourcing early development centered on Contract Research and Manufacturing Services (CRAMS), which allow big pharma to evaluate molecules rapidly at a fraction of the cost of internal resources while minimizing GMP exposure.

The global pharma market is expected to reach $1.17-1.2T in 2017,3 with a significant contribution to growth coming from the emerging markets. That growth has given rise to a burgeoning middle class that demands the same quality of drugs as the U.S. and Europe. However, growth has been painful. In China, for example, scandals involving drug quality and safety have plagued material supplied by both API and drug manufacturers, highlighting the complexities inherent in bridging the gap between compliance and quality. Despite entering the global marketplace ahead of China, India still grapples with this concept, with data integrity a core issue for both API and drug products manufactured there. The Ranbaxy sanctions, which revealed institutionalized fraud, underscore the importance of creating a culture that clearly emphasizes the role each element of the drug development and manufacturing process plays in ensuring drug safety and efficacy, while linking the compliance activities that underpin these elements to the consequences of noncompliance. The lessons learned from this investigation should be kept at the forefront as we look to leverage the promise of digital health, which further integrates the patient into the treatment lifecycle, because fraud can have far-reaching consequences for both the individual and the marketplace.

In the decade to come, the complexity of the global market will continue to increase as global economies evolve. The Next 11 countries (Bangladesh, Egypt, Indonesia, Iran, Mexico, Nigeria, Pakistan, the Philippines, Turkey, South Korea and Vietnam) are also beginning to mature in terms of market requirements and are moving toward elevated compliance requirements. With this evolution come more complex regulatory and compliance strategies, as development organizations will have to balance regional and global compliance standards that are not at parity yet not so far apart as to demand completely different solutions.

Regulatory Compliance Convergence

Several factors are driving what can be called compliance convergence. Historically, the USP and EP established the generally accepted minimum requirements for analytical testing and characterization, and the ICH guidances set best-in-class minimums for all markets. Regulatory disparity represented the greatest hurdle to broad adoption of these standards, but that hurdle is rapidly disappearing as a result of the success of the Pharmaceutical Inspection Co-operation Scheme (PIC/S). Today PIC/S membership consists of approximately 50 regulatory authorities, with roughly 70 countries projected to be enrolled by 2020. PIC/S' goal is to harmonize inspection procedures worldwide by developing common standards in the field of GMP; it also aims to facilitate cooperation between competent regional and international authorities and thereby increase overall mutual regulatory confidence. The natural consequence is a much higher awareness of the relationship between compliance and product quality. Markets that embrace PIC/S help narrow the gap between regulatory regimes, benefiting everyone and opening markets that were previously inaccessible.

Risk and Process Understanding

Underlying this convergence is the formalized adoption of risk management as part of the drug development and regulatory framework. Until recently, risk assessment was applied only informally to product development. Device development has mandated it for years, with ISO 14971 a cornerstone of the Design Control process. The drug industry, by contrast, has only recently begun to embrace risk as an essential component of technical development and regulatory strategy, and as a core element of an effective quality management system (QMS). Many regulatory bodies now mandate the use of risk assessment tools as part of a regulatory package. It is the adoption of risk management throughout the development and commercial process that has been a key factor in the industry's ability to move away from a documentation- and traceability-focused framework for quality to one based upon product and process understanding. To illustrate the extent to which risk management has become a foundation of today's regulatory thinking, Japan, considered by many one of the most conservative regulatory entities, deregulated its controls on regenerative medicine in 2015, allowing programs to commercialize after Phase 2 clinical trials. Embracing the new trend toward real-life data, the need to conduct a Phase 3 trial can be waived, contingent upon post-market vigilance.

Legal Environment

Significant developments within the legal environment are shaping the future strategy for drug development. In the U.S., the Supreme Court ruled in 2013 that human gene sequences cannot be patented unless they have been manipulated or modified, and that no molecule that is a "product of nature" can be patented. This decision invalidated more than 4,000 existing patents and made those gene sequences available to research and commercial entities. More recently, the Supreme Court paved the way for biosimilars to reach the marketplace more quickly by ruling that their makers need not wait six months after FDA approval of their own product before launching it. Perhaps the legal initiative with the greatest potential to impact the industry is the 21st Century Cures Act.4 This act, signed into law at the end of 2016, allocates over $6B in grants for research. One of its most significant and controversial provisions, however, gives new drug and device manufacturers the option of demonstrating safety and efficacy using real-life data as opposed to randomized clinical trials. This has the potential to reduce the cost of development and speed new and innovative therapies to the marketplace. We are already seeing an impact in the device world, where new diagnostic therapies are being approved based upon a hybrid of real-life data and small controlled trials. How this evolves over the next decade should be watched closely as global data privacy laws become more of an issue. If this can be done effectively without compromising patient safety, the cost and complexity associated with clinical development will change forever.

Innovation Fueled by Data

Regulatory philosophies and industry thinking have both had to change to realize the potential of emerging drug therapies and technologies. The importance of data, and its role in demonstrating both performance and compliance, has increased over the last decade, and the complexity and value of data will only increase over the next. For example, innovations in next-generation sequencing (NGS) allow physicians to confirm that their patients will respond effectively to a drug therapy by verifying the genetic basis for their disease state. Each run can generate between 200,000 and 500,000 data points to be analyzed, based upon our findings from the Thermo Fisher NGS system recently approved by the FDA as a companion diagnostic. It is clear that data volumes at this level require a very different approach to both verification and compliance. Complicating the matter, however, is the fact that routinely only 93% of the data is accurate. So by design 7% is erroneous! Endorsing a process that generates bad data by design is completely foreign to both the FDA's and the industry's thinking. What bodes well for the industry's future, however, is the agency's willingness to embrace these paradigm-breaking technologies in order to realize their ultimate benefit.
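
To put those figures in perspective, the arithmetic below is a simple illustrative sketch in Python; the 93% accuracy figure and the 200,000-500,000 data points per run come from the paragraph above, and everything else is hypothetical:

    # Illustrative arithmetic only: accuracy and run sizes are taken from the text above.
    ACCURACY = 0.93  # fraction of data points assumed to be correct

    def erroneous_points(points_per_run: int, accuracy: float = ACCURACY) -> int:
        """Expected number of erroneous data points in a single NGS run."""
        return round(points_per_run * (1.0 - accuracy))

    for run_size in (200_000, 500_000):
        print(f"{run_size:>7,} data points -> ~{erroneous_points(run_size):,} erroneous by design")

Even at the low end of the stated range, roughly 14,000 data points per run are expected to be erroneous, which is why verifying each data point individually simply cannot scale.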

Generating huge quantities of data creates challenges as organizations adapt to manage petabytes of data while ensuring data integrity and compliance. Another problem for both developers and regulators is how to validate an assay that routinely generates erroneous data. Complicating the matter is the fact that the assay controls can be based on as many as 200-300 custom oligonucleotide chains. It did not take long for the FDA to recognize that the conventional approach of asking for process validation around the manufacture of each oligonucleotide chain was not viable. Personalized medicine and antibody-drug conjugates challenge the conventional definitions of product and process design, and of quality. The landmark guidance issued by the FDA1 promoted a shift in thinking about process and product quality toward a framework based upon scientific insight. However, it was not until these new technologies were mature enough to commercialize that the agency and industry came to grips with the reality of this new thinking.

The Rise of Analytics

There is a clear trend amongst industry and regulators toward analytics frameworks. The FDA issued its guidance on quality metrics for the drug industry in 2016 partly in recognition that an organization's QMS is composed of discrete processes, and that high-performing processes require a clearly articulated governance structure. Industry may have been slow to respond to the guidance but, buoyed by a decade of operational excellence initiatives, it is not unaware of the benefits of measuring performance through Key Process Indicators (KPIs). This long-simmering awareness has grown into several significant initiatives. The first is Cloud computing. The FDA has already provided guidance regarding its expectations for Cloud computing compliance, and this trend will gain ground with the steady increase in the use of contract service providers for development, testing, and manufacturing. Given the global nature of the supply chain, the need to access and utilize data has never been greater.

Another driver is the Drug Quality and Security Act (DQSA), enacted in 2013. This legislation is intended to create comprehensive traceability across the entire supply chain. 2017 was to be the year manufacturers had to implement unit-of-sale-level traceability; the FDA has pushed compliance back to November 2018, providing a slight reprieve. However, the technical challenges associated with data gathering, retrievability, and mandatory reporting will require a complex architecture, particularly when using a third-party contract service provider. The challenge for the next decade will be developing a practical architecture for data management that supports the clear trend toward outsourcing and virtual drug development. Multi-tenant architectures have been proposed as a way for CMOs and CSPs to accommodate multiple customers without breaching privacy or data integrity. My experience with most CMOs is that their internal sophistication is geared toward their lowest common denominator, and IT innovation is not on the horizon for most organizations. However, I believe virtual manufacturing is here to stay, which means we will see a new division of labor and partnership in strategic and tactical planning, with the drug sponsor taking a larger role in the CMO's infrastructure development.
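
The isolation requirement behind such multi-tenant architectures can be illustrated with a minimal sketch. The Python below is purely hypothetical (the record fields and the TenantStore class are illustrative, not any particular vendor's design); it shows the core idea of tagging every record with a tenant identifier and filtering every read by it, so one sponsor's batch and serialization data is never visible to another:

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class BatchRecord:
        tenant_id: str                    # the sponsor (customer) that owns this record
        batch_number: str
        serial_numbers: tuple[str, ...]   # unit-of-sale serialization data (DQSA-style)

    @dataclass
    class TenantStore:
        """Hypothetical multi-tenant store: every read is scoped to a tenant."""
        _records: list = field(default_factory=list)

        def add(self, record: BatchRecord) -> None:
            self._records.append(record)

        def batches_for(self, tenant_id: str) -> list:
            # Tenant isolation: a sponsor only ever sees its own records.
            return [r for r in self._records if r.tenant_id == tenant_id]

    store = TenantStore()
    store.add(BatchRecord("sponsor-a", "LOT-001", ("SN100", "SN101")))
    store.add(BatchRecord("sponsor-b", "LOT-777", ("SN900",)))
    assert [r.batch_number for r in store.batches_for("sponsor-a")] == ["LOT-001"]

In practice the same scoping would also be enforced at the database and access-control layers, but the principle (no query without a tenant context) is what would allow a CMO or CSP to serve multiple sponsors without compromising privacy or data integrity.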

There are drug companies that have shifted completely away from vertical product development and manufacturing and concentrate instead on how best to utilize contract service providers as a surrogate for internal staff. Central to this strategy is the need to acquire and share data with the drug sponsor, and some manufacturers are evaluating whether it is possible to share data among service providers to drive performance. Look for these solutions to mature over the next decade as reliable quality and manufacturing data become central to the virtual development and manufacturing paradigm.

Big Data Analytics and Patient Engagement

Much has been written about the promise of Big Data analytics, particularly as it pertains to gathering and leveraging data for e-commerce. Not surprisingly, big pharma and biotech have lagged in pursuing Big Data, but it has gained a foothold in R&D. One of the challenges our industry has always faced is the time and money required to bring a new drug therapy through the development process only to stumble in Phase 3 clinical programs. Today, organizations like PatientsLikeMe serve as portals for patients, developers, and researchers to share in the treatment of a disease state. These services provide a never-before-tapped resource for developers to understand patient preferences that can affect adoption, contraindications, alternative therapies, and more. The ability to manage and harvest this data becomes the basis for the Big Data initiative. Pharma companies are already mining resident data derived from their QMS, manufacturing, and development history with an eye to identifying better drug candidates faster.

Digital Health

Patient engagement is already changing through the use of lifestyle software and IoT devices. Genetic characterization services such as 23andMe now allow patients to research DNA markers and proactively drive their own healthcare programs. Laboratory Developed Test (LDT) service providers can give patients and healthcare providers benchmarks and metrics indicating where a patient sits on the continuum of their health. This data can be used to drive lifestyle decisions, from diet to pre-emptive elective surgery. Clinical trials use wearable monitoring solutions as part of their data-gathering architecture, and the FDA's guidance on Software as a Medical Device (SaMD) provides the framework for utilizing these technologies. Such solutions are having a huge impact on patient engagement and are tempering clinical trial drop-off. More accurate data acquisition can fuel significant innovation in clinical trial design, and SaMD applications will only expand as the technology continues to evolve.

The Rise of e-Clinical Platforms

The use of e-clinical platforms has changed the context for conducting clinical trials, offering low-cost, highly reliable solutions for gathering data. Regulatory requirements for safety monitoring are increasing the amount of study data that must be gathered, organized, and analyzed. Innovative e-clinical technologies will help manage these data requirements, reduce development costs, support faster "Go/No-Go" decisions for potential new products, and increase efficiency throughout the clinical trial process. These systems can handle trial supply chain logistics, manage clinical program documentation, provide electronic data capture, and support patient self-reporting. The result is lower-cost clinical trial management, data acquisition, and reporting with higher data fidelity than manual systems. For regional clinical trials, such technology allows drug sponsors to circumvent the variation in GCP rigor that previously dogged clinical trial data in certain markets.

Drug development and clinical trial management are poised to realize unprecedented changes in efficiency and effectiveness, prompting innovative new approaches to clinical trial design. Patient centricity is emerging as a new trend in clinical trial design, in which the patient is included in the design of less burdensome studies, inclusion/exclusion criteria are optimized to enroll more patients, and the clinical trial environment is made more engaging and convenient for the patient.

Cybersecurity and Data Compliance

As Pharma continues to expand its dependence on data as a foundation for drug development and clinical trial management, having a nimble and robust cybersecurity program will become ever more critical. Intelligent devices being used for data acquisition will pursue FIPS 140-2 certification. Currently, cybersecurity is a lagging function, meaning it is not the primary consideration when designing and implementing an information management system.

Exacerbating the risk of data exposure is a renewed emphasis on data privacy. Although the U.S. has opted to take a step back in terms of data privacy with its recent decision to allow ISPs to sell personal data without consent,5 the rest of the world is moving toward greater sanctions for compromised privacy. The European Union's General Data Protection Regulation (GDPR) goes into effect in May 2018 and provides for fines of up to 20 million euros or 4% of a company's worldwide annual turnover, whichever is greater, for privacy breaches. One of its provisions requires notification of the supervisory authority within 72 hours of becoming aware of a breach. This has already spawned a nefarious new industry called "Ransomware as a Service," which targets companies regulated under GDPR and threatens to keep their data encrypted beyond the 72-hour window, exposing the organizations to GDPR's onerous fines. Given the constantly evolving threat profile, cybersecurity will not remain in the background of organizations' strategic initiatives in the next decade. Look for cybersecurity approaches to harmonize and move to the forefront of many organizations' strategic plans.
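
As a worked illustration of that fine structure (a sketch only; the turnover figures below are invented for the example), the exposure is whichever of the two caps is greater:

    def gdpr_fine_cap(worldwide_annual_turnover_eur: float) -> float:
        """Upper bound of a GDPR fine: EUR 20M or 4% of worldwide annual
        turnover, whichever is greater (illustrative arithmetic only)."""
        return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

    # Hypothetical turnover figures, purely for illustration:
    for turnover in (100e6, 500e6, 5e9):
        print(f"turnover EUR {turnover:>15,.0f} -> maximum fine EUR {gdpr_fine_cap(turnover):,.0f}")

For any company with worldwide turnover above 500 million euros, the 4% cap is the binding one, which is why the regulation is felt so acutely by large, data-rich organizations.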

Conclusion

There is no doubt we have moved ahead tremendously since the 2004 FDA guidance was first issued. Subsequent FDA guidance on process validation and the EMA's Annex 15 have pushed the industry toward Quality by Design (QbD) as a basis for product development. New technologies have transformed our classical framework for defining compliance and demonstrating control and product performance. As we move into the next decade, these foundational changes will allow regulators and industry to take advantage of the incredible innovations we will see in healthcare. Our investment in handling big data will allow us to mine lost nuggets of insight from our historical performance and to learn more effectively from e-clinical and post-market vigilance initiatives. As PIC/S gains acceptance, the global marketplace will look to these capabilities as well, narrowing the gap in healthcare quality between the emerging and Next 11 countries and the U.S. and Europe. Even as Brexit looms in Europe, it will be very difficult to move backward in how we measure quality, as the pressure to deliver business performance will not cease. With the broad adoption of digital data acquisition and analysis, the industry's ability to manage the constantly evolving threats to data management and integrity will define how we develop new drug therapies in the future.

References

  1. FDA Guidance for Industry: Pharmaceutical CGMPs for the 21st Century - A Risk-Based Approach, https://www.fda.gov/drugs/developmentapprovalprocess/manufacturing/questionsandanswersoncurrentgoodmanufacturingpracticescgmpfordrugs/ucm071836.htm
  2. BRICS to continue growing momentum, contributions to world, Global Times, http://www.globaltimes.cn/content/1011813.shtml
  3. IMS Institute projects global pharma market of $1.17-1.20 trillion in 2017, Pharmaceutical Commerce, http://pharmaceuticalcommerce.com/business-and-finance/ims-institute-projects-global-pharma-market-of-1-17-1-20-trillion-in-2017/
  4. H.R.34 - 21st Century Cures Act, Congress.gov, https://www.congress.gov/bill/114th-congress/house-bill/34
  5. B. Fung, What to expect now that Internet providers can collect and sell your Web browser history, The Washington Post, https://www.washingtonpost.com/news/the-switch/wp/2017/03/29/what-to-expect-now-that-internet-providers-can-collect-and-sell-your-web-browser-history/
