The Importance of Effective Data Capture and In-Lab Tech for Bioanalytical CROs: Implications for Quality and Customer Satisfaction

Asked about obstacles that make them second-guess outsourcing work to bioanalytical contract research organizations (CROs), pharmaceutical companies do not mince words: “regulatory compliance and scientific expertise” and “unresponsiveness and imprecise data”1 are among the more cutting remarks. This is not what prospective customers need to hear. In a highly competitive industry where compliance, quality, and integrity are critical to winning and retaining business, pharma sponsors must have high confidence in their CRO partners.

Meanwhile, early drug discovery and preclinical testing remain among the most labor-intensive phases of the drug development cycle. With just one in 5,000 compounds that enter preclinical testing progressing to clinical trials,2 rigorous processes and expertise that effectively orchestrate these phases are essential. This crucial capacity, both specialist and labor intensive, is fueling the current rise in the global preclinical CRO market, which is expected to reach $7.8 billion by 2027, registering a CAGR of 8.3% over the next seven years.

Bioanalytical CROs able to drive effective bioanalytical workflows, from sample preparation to analysis and auditing, stand to gain the most over the forecasted growth period. But many factors determine the value of a CRO partner, and technology, especially in the Quality department, holds the key to improving each of them.

CROs themselves face challenges: an explosion in sample volume, a cost-sensitive market, tight turnaround times, an increasing diversity of assays, and a need to demonstrate regulatory compliance and build their credibility. They have encountered a market-driven demand, simply put, to do more, for less, faster, and more reliably.

Within the CRO, priorities further vary, depending on the point of observation. Bench scientists seek tools to refocus their work on method execution, managing or eliminating deviations, and improving collaboration. IT leaders naturally emphasize digital transformation with demonstrable ROI, in particular where a more efficient use of resources can be gained. And executive management is bottom-line focused, seeking means to minimize cost and maximize yield while reducing need for additional equipment or headcount. They’d like to reduce project times and invoice faster as well.

A software approach further recognizes that, typically, people aren’t the problem – in fact, they’re a CRO’s greatest asset. Attempting to push the compliance burden onto people and paper (or spreadsheets), while appearing to require the least change management and outlay, in fact cripples a research operation and produces the opposite of the desired outcome. Pushing audit trail construction and QC reporting onto manual processes directly impacts sample throughput and introduces uncertainty compared with what can be achieved in digital systems. It is rarely billable work, and it is time consuming.

April Pisek, an experienced bioanalytical scientist in the contract research space and a solutions consultant at IDBS, expanded on what a multi-layered quality system can do for confidence in data and offered a five-step model to drive expectations: “First, engineer quality-by-design before the data is even entered into the system, reducing manual, subjective, and laborious QC processes. Second, build detailed audit trails that are not only thorough, but user friendly and human readable. Third, provide the ability to reconstruct data in an instant to get to insight or impact assessment. Next, design a validated ability to aggregate information from the source of truth thus eliminating QC review. Lastly, design a data model that has intention and purpose to serve Quality teams while allowing remote auditing from anywhere in the world.”
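
To make the audit-trail steps of this model concrete, consider a minimal Python sketch of an append-only, human-readable audit record. The AuditEvent structure, its field names, and the identifiers in the example are illustrative assumptions, not a description of any particular product:

  from dataclasses import dataclass, field
  from datetime import datetime, timezone
  from typing import Any

  @dataclass(frozen=True)
  class AuditEvent:
      # One immutable audit-trail entry, constructed automatically at the point of action.
      actor: str       # authenticated user ID
      action: str      # e.g. "result.amend"
      subject: str     # sample or record identifier
      old_value: Any
      new_value: Any
      reason: str      # justification, required for amendments
      timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

      def human_readable(self) -> str:
          # Render the event as a sentence a QA reviewer can read without decoding.
          return (f"{self.timestamp:%Y-%m-%d %H:%M} UTC: {self.actor} performed "
                  f"'{self.action}' on {self.subject}, changing {self.old_value!r} "
                  f"to {self.new_value!r}. Reason: {self.reason}")

  trail: list[AuditEvent] = []  # append-only in a real system
  trail.append(AuditEvent("jdoe", "result.amend", "SAMPLE-0042", 12.1, 12.3,
                          "Transcription error corrected against the raw data file"))
  print(trail[-1].human_readable())

The essential design choice is that the entry is constructed automatically at the point of action and renders as a plain sentence, which is what makes reconstruction “in an instant” plausible for a reviewer.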

Operationalizing quality-by-design is a further expectation to put on software. Today’s best systems blur the line between electronic lab notebook (ELN), method execution, sample management, and quality management. As such, method execution support is a baseline capability against which to evaluate solutions. It introduces consistency in execution, supports onboarding and training, and permits flexible use of staff across projects. User qualification and conditions for material handling can be similarly enforced. In this way, the software supports knowledge transfer, both internally and, ideally, from sponsor to CRO, a known and frequent pain point. Parameterizing processes in this way helps CROs ask the “right questions” to achieve consistency. Finally, data governance can be operationalized in software.
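
Before turning to data governance, here is an illustration of what parameterizing a process can mean in practice: a minimal Python sketch in which a method step declares the operator qualification and the approved material lots it requires, and refuses to execute otherwise. The MethodStep and Operator structures, the SOP identifier, and the lot numbers are hypothetical:

  from dataclasses import dataclass

  @dataclass
  class Operator:
      user_id: str
      qualifications: set[str]     # training records held by this user

  @dataclass
  class MethodStep:
      name: str
      required_qualification: str  # training the operator must hold
      allowed_materials: set[str]  # lots approved for use in this step

  def execute_step(step: MethodStep, operator: Operator, material_lot: str) -> None:
      # Enforce user qualification before the step can begin.
      if step.required_qualification not in operator.qualifications:
          raise PermissionError(f"{operator.user_id} is not qualified for '{step.name}'")
      # Enforce material-handling conditions at the point of entry.
      if material_lot not in step.allowed_materials:
          raise ValueError(f"Lot '{material_lot}' is not approved for '{step.name}'")
      print(f"{operator.user_id} executed '{step.name}' with lot {material_lot}")

  prep = MethodStep("Buffer preparation", "SOP-117", {"LOT-A1", "LOT-A2"})
  execute_step(prep, Operator("jdoe", {"SOP-117"}), "LOT-A1")

Enforcing these conditions in the execution path itself, rather than in a post-hoc review, is what moves quality from inspection to design.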

This last point about data governance is critically important. Organizations leaning into data strategy understand the key role of governance. They convene data governance groups to debate and set standards for adoption and to perform data stewardship functions. However, data governance that relies heavily on individual compliance does not tend to work out well. When business rules are set and enforced in software (or, best of all, applied at the moment of data acquisition in a thoughtfully designed integration), data requires no further manipulation to be “reporting-ready.”
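
As a minimal sketch of this idea, assuming a hypothetical rules table and capture function, the following enforces a unit and range rule at the moment of acquisition so that a passing record is already reporting-ready:

  # Hypothetical governance rules, set once by a data governance group
  # and enforced in software at the moment of acquisition.
  RULES = {
      "analyte_conc": {"unit": "ng/mL", "min": 0.0, "max": 10_000.0},
  }

  def capture(field_name: str, value: float, unit: str) -> dict:
      rule = RULES[field_name]
      if unit != rule["unit"]:
          raise ValueError(f"{field_name} must be recorded in {rule['unit']}, got {unit}")
      if not rule["min"] <= value <= rule["max"]:
          raise ValueError(f"{field_name}={value} is outside the plausible range")
      # A record that passes needs no further manipulation before reporting.
      return {"field": field_name, "value": value, "unit": unit, "status": "reporting-ready"}

  print(capture("analyte_conc", 523.4, "ng/mL"))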

No discussion of this type would be complete without an accounting of the efficiency gains yielded by adoption. Fortunately, they can be tied to specific tasks in the bioanalytical workflow. Non-bench tasks relating to data management are consistently shortened (see Figure 1). This technically enabled workflow includes the integration of reagent and buffer preps, the broad adoption of barcodes for consumables, reagents, and instruments, the capture of deviations from method at the point of entry, and documentation in real time. Real-time equipment verification builds compliance into the workflow and prevents common errors such as the use of incorrect materials or equipment.
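
As one illustration, real-time equipment verification can be as simple as checking a scanned barcode against the step’s required device class and a calibration registry before any data is recorded. The registry, the barcodes, and the verify_scan function below are illustrative assumptions rather than any specific product’s behavior:

  from datetime import date

  # Hypothetical calibration registry keyed by instrument barcode.
  CALIBRATION_VALID_UNTIL = {
      "BAL-0007": date(2026, 3, 1),    # balance, in calibration
      "PIP-0112": date(2024, 1, 15),   # pipette, calibration expired
  }

  def verify_scan(barcode: str, required_class: str, today: date) -> None:
      # Reject the wrong device class, e.g. a pipette scanned where a balance is required.
      if not barcode.startswith(required_class):
          raise ValueError(f"{barcode} is not a '{required_class}' device")
      # Reject unknown or out-of-calibration equipment before any data is recorded.
      if CALIBRATION_VALID_UNTIL.get(barcode, date.min) < today:
          raise ValueError(f"{barcode} is out of calibration or not registered")
      print(f"{barcode} verified for use")

  verify_scan("BAL-0007", "BAL", date(2025, 6, 1))   # passes; "PIP-0112" would be rejected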

Figure 1. Study timeline impact of a software-mediated method execution and quality process

In the paper scenario, the challenges that draw the study time out to 49 days in the small molecule analysis workflow examined can be reduced to two principal root causes: difficulty in locating and discerning data for inclusion in a report or a retrospectively constructed audit, and processes with a dependency on a single individual. The software-mediated counterpart yields a 75% reduction in QA overhead, or thirty days of work in the small molecule analysis illustrated.

CROs today seek advantage in a globally competitive market, facing the challenges outlined above. The result can be described as “What a Winner Looks Like.” When seeking to attain the advantage that well-architected software offers, CRO leaders can expect to win by:

  1. Increasing the number of studies and samples tested, opening potential to take market share.
  2. Removing the need for new people and equipment to meet demand, controlling operating costs.
  3. Tightening study turnaround time and proving quality of results, exceeding sponsor expectations.
  4. Reducing the effort and time to demonstrate regulatory compliance in practice.
  5. Building credibility by powering the business with industry-recognized software and services: a “halo effect”.

We’ve only briefly mentioned instrument integration as a feature of this data capture strategy. To be clear, integration is a pillar of any modern scientific informatics solution. Historically, tools in the ELN class have focused on data capture by any means, be it unstructured, structured, harmonized with other data, or consistent with a broader standard – anywhere along a data maturity model. This is the general concept of the “paper replacement” or “paper-on-glass” ELN. Untold exabytes of semi-dark data exist in this state, awaiting remediation or transfer to another database structure that may (or may not) permit it to become the source of further insight.

Today’s scientific informatics customer expects much more in terms of the capture and presentation of data: not only the basis for operational insight and reporting, but also strategic insight that offers process understanding and enables kaizen, or continuous improvement. CROs that have this kind of relationship with their sponsors evolve from service provider to full, trusted partner.

While this article focuses on the challenges and benefits to quality processes at a bioanalytical CRO, the core messages may be transferred and amplified in the contract manufacturing organization (CMO) and contract development and manufacturing organization (CDMO) spaces. There is real urgency to obtain not only operational and regulatory support from software systems in manufacturing, but also the kind of strategic insight that supports digital twinning and process optimization.

When workflow design anticipates integration and quality needs – what Pisek described as the “intention and purpose” of the system – the presentation of data for reporting and greater insight is much less burdensome. This is probably the most important learning of the 2010s in scientific informatics, and it led to a pivot toward the uses of data downstream from the moment it is acquired. For so long, these systems were about data capture, period. The most forward-thinking players in our industry ask: where does the data go next?

References

  1. Spooner N, Cape S, Summerfield S. Outsourcing strategies in bioanalysis. Bioanalysis. 2017;9(15). Accessed online.
  2. Institute of Medicine (US) Committee on Conflict of Interest in Medical Research, Education, and Practice; Lo B, Field MJ, editors. Conflict of Interest in Medical Research, Education, and Practice. Washington (DC): National Academies Press (US); 2009. Appendix E, The Pathway from Idea to Regulatory Approval: Examples for Drug Development.

Graeme Dennis has been the Commercial Director, Preclinical Pharma at IDBS since 2018. Prior to IDBS, Graeme held scientific informatics leadership roles in academia and industry, including Accenture and Vanderbilt University, where he studied Chemistry (B.S. 1999). Graeme is interested in systems that help organizations position scientific data as an asset for operational and strategic use. He lives in Nashville, Tennessee and enjoys music and hiking.
