In many respects, this is the worst of times in the world’s scientific community – but it could produce the best of times.
The reason for the duality?
On one hand, researchers in the pharmaceutical, biotechnology, materials and other scientific fields are frustrated that tens of billions of dollars in research and development are yielding so little.
It takes a major pharmaceutical company eight to 12 years, and from US$1.2 billion to US$1.8 billion, to bring a new product to market, according to IDC Health Insights in Framingham, Massachusetts (USA).
The era in which scientists could pick the low-hanging fruit of conquering widespread diseases is giving way to a more challenging period of confronting far more difficult diseases such as Alzheimer’s, and those that affect relatively small populations, including multiple sclerosis (MS), Lou Gehrig’s disease (amyotrophic lateral sclerosis) and rheumatoid arthritis.
Meanwhile, the complexity that researchers face as they begin to leverage the human genome, probe the deepest workings of complex organs such as the heart and brain, or analyze data from personal fitness devices is growing by orders of magnitude, threatening to overwhelm scientists with an explosion of data.
Materials research, which touches virtually every product used in the world, is also wrestling with an explosion of complexity and data as scientists learn how to analyze materials at the nanoscale. “We are facing a lot of the same issues faced by our friends in the pharma industry,” said John Mauro, research manager of the glass research group at Corning Incorporated, in Corning, New York (USA).
A NEW WORLD ORDER
For every challenge, however, the marriage of technology and scientific innovation offers an exciting opportunity. In the life sciences, for example, wearable computing tools are collecting data not only on what patients can observe, but also on what they can’t – and that data is free of human error and distortion.
As data sets grow ever larger, analytics are helping scientists achieve deeper insight. With increasing precision, scientists can model, simulate and predict how individual molecules will affect human tissue or how carbon graphite will endure in a jet aircraft when exposed to wide variations in air pressure and temperature.
“The biggest game changer in the world is the ability to do high-performance computing,” said Sanjay Mehta, manager of computational modeling at Air Products, a manufacturer of industrial gases and performance materials located in Allentown, Pennsylvania (USA). “It is all about the ability to simulate thousands and millions of scenarios. You can’t go into the lab and perform every experiment. It’s just not possible.”
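Mehta’s point – sweeping huge numbers of simulated scenarios rather than running every experiment physically – can be illustrated with a minimal sketch. Everything in it is hypothetical: the toy strength model, the thresholds and the parameter ranges merely stand in for the physics-based codes an industrial high-performance computing group would actually run.

```python
import random

def simulated_strength(temperature_c, pressure_kpa):
    """Hypothetical stand-in for a physics-based model of material performance.
    A real HPC workflow would call a finite-element or molecular simulation here."""
    return 100.0 - 0.05 * abs(temperature_c) - 0.0004 * pressure_kpa + random.gauss(0, 1)

# Sweep 100,000 randomly sampled operating scenarios instead of running
# each one as a physical experiment in the lab.
scenarios = [(random.uniform(-60, 200), random.uniform(20, 110_000))  # temperature (C), pressure (kPa)
             for _ in range(100_000)]
results = [simulated_strength(t, p) for t, p in scenarios]

# Flag how often the predicted performance drops below a design threshold.
failures = sum(1 for r in results if r < 50.0)
print(f"{failures / len(results):.2%} of simulated scenarios fall below the design threshold")
```

The pattern scales from this toy loop to clusters churning through millions of cases; only the scenarios that look promising, or worrying, ever need to reach the lab bench.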
This ability to design materials on demand is revolutionary. Corning, for example, used data-driven modeling, along with breakthroughs in predicting and understanding the core properties of glass, to develop the third generation of its trademarked Gorilla Glass – found on many mobile phones and devices – which is engineered to absorb energy rather than break.
The life sciences have traditionally been slower to commercialize than many sectors of materials research because of difficulties in linking the work of drug discovery to drug development, manufacturing, clinical trials, laboratory testing and, ultimately, review by government regulators.
“The whole idea of how you travel from the lab, whether for consumer products or pharmaceuticals or a medical device, to your consumer or patient in a more seamlessly manageable way is certainly a theme for every industry,” said Paul McKenzie, senior vice president for manufacturing and technical operations of Janssen Pharmaceuticals, the pharma business of New Jersey (USA)-based Johnson & Johnson (J&J).
A RESEARCH MODEL SHIFT
To achieve a new era of dramatically improved innovation, the basic model of scientific discovery and commercialization is experiencing radical change.
For decades, the scientific research community – particularly in life sciences – has been characterized, in simplified terms, by individual scientists or small groups who labor in isolated silos. Successes have been shared but failures have been forgotten, dooming others to repeat them. When a group of scientists does identify a useful compound, the information is transferred to a development team, which in turn “throws it over the wall” to manufacturing. Still other teams manage the clinical trials and seek regulatory approval. All of which generates a certain measure of repetitive effort and lost intellectual property (IP) as each group tries to understand the full context of the previous group’s work.
“You’ve got a dysfunctional model,” argues Bernard Munos, who spent 30 years at Eli Lilly in Indianapolis, Indiana (USA), and is now senior fellow at FasterCures, a center of the Milken Institute in Santa Monica, California (USA).
In the new, improved model, scientists enter all their results – successes and failures alike – in electronic laboratory notebooks (ELNs). That data can be stored centrally and serve as benchmarks for researchers in other projects and other geographies – perhaps even shared with other companies. “Pharma organizations were once very much vertically integrated,” said Andrew Brosnan, a UK-based pharmaceutical industry analyst for research firm Ovum. “We’re seeing the unraveling of that now.”
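As a rough sketch of the ELN workflow described above – with hypothetical field names, not the schema of any actual ELN product – an entry might record a failure in exactly the same structure as a success, so both land in the same central store:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class ExperimentRecord:
    """Hypothetical ELN-style entry; real systems define much richer schemas."""
    project: str
    scientist: str
    compound_id: str
    outcome: str   # "success" or "failure" -- failures are captured, not forgotten
    notes: str

# Both outcomes go into the same central store, so teams in other projects
# and geographies can benchmark against them instead of repeating the work.
records = [
    ExperimentRecord("oncology-screen", "a.chen", "CMP-0041", "failure", "no binding observed"),
    ExperimentRecord("oncology-screen", "b.okafor", "CMP-0107", "success", "strong binding at low dose"),
]

with open("central_eln_store.json", "w") as fh:
    json.dump([asdict(r) for r in records], fh, indent=2)
```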
SILO WALLS BEGIN TO FALL
Today, successful research seldom occurs in a single lab or even a single organization, but through the collaboration of many scientists and developers.
“The reality of our world is moving toward more of a virtual integrated network,” said Janssen’s McKenzie, who is based in Horsham, Pennsylvania (USA). “I may do steps A, B and C internally on a drug entity, but D, E and F may be done externally.”
The work of India’s Council of Scientific and Industrial Research (CSIR) to combat the resurgence of deadly, drug-resistant forms of tuberculosis (TB) could provide a new model for how discovery should proceed. CSIR adopted an open-source R&D model and appealed to scientists around the world to contribute their expertise. Ultimately, 830 scientists volunteered, achieving significant progress in combating the disease in just four months.
“If you put 830 people on such a challenge, you can succeed and you can succeed quickly,” Munos said. “That was 300 man-years of work. No scientist in his right mind would take this as a lifetime project because he would never see the end of it and never get recognition for it.”
AGREEING ON A STANDARD
The TB project clicked because it came in response to a humanitarian emergency. But in the corporate world, creating the systems that will support broader collaboration outside a company, with commercial and academic partners or even across competing organizations, will be more difficult. Who controls which piece of IP is a critical topic. So is the need for common standards and protocols so that players who have never communicated can begin to do so with ease.
Companies in every field of science define their research data differently, making it impossible to compare results. “We all define the data differently,” said Gerhard Noelken, Pfizer’s senior director of technology and innovation based in the UK. “There is not one dictionary for how you describe how you define an analytical result. If you can agree on a proper definition, exchanging data and information between companies and between partners would be so much easier.”
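Noelken’s “one dictionary” idea can be made concrete with a small sketch: if partners agree on a single definition of an analytical result, each record can be checked against that definition before it is exchanged. The field names and rules below are invented for illustration, not an existing industry standard.

```python
# A hypothetical shared definition of an "analytical result": partners agree
# on field names, types and units before exchanging any data.
ANALYTICAL_RESULT_SCHEMA = {
    "sample_id": str,
    "method": str,        # e.g. "HPLC", "NMR"
    "analyte": str,
    "value": float,
    "unit": str,          # agreed unit, e.g. "mg/mL"
    "measured_at": str,   # ISO 8601 timestamp
}

def validate(record):
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected_type in ANALYTICAL_RESULT_SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"{field} should be of type {expected_type.__name__}")
    return problems

incoming = {"sample_id": "S-2014-001", "method": "HPLC", "analyte": "impurity A",
            "value": 0.12, "unit": "mg/mL", "measured_at": "2014-03-01T09:30:00Z"}
print(validate(incoming) or "record conforms to the shared definition")
```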
Agreeing on who should set the standards is proving to be a challenge. Major end-users such as J&J are lobbying technology vendors to agree on ways to make their IT offerings “platform agnostic.” If vendors cannot, or will not, the industry risks erecting a new Tower of Babel and creating more islands of “dark data” that are inaccessible outside a single organizational silo.
“We have technologies from many, many companies,” Janssen’s McKenzie said. “But we haven’t found a company yet that has completely got this vision to really provide an approach where other systems can access each other and really partner up. My hope is that vendor collaboration around standards will be a success.”
FROM LAB TO SHELF
The new vision of research starts in the laboratory. For many companies, the launch of ELNs was frustrating because some scientists resented the imposition of a new housekeeping chore or felt it exposed their ideas before they were ready for evaluation. In the early going, the software and supporting databases also were not robust enough. “Most of the early efforts didn’t deliver the value that people had hoped for,” said Alan S. Louie, research director of IDC Health Insights.
But increases in raw computing power have helped. For example, at North America’s Merck & Company (MSD), scientists once had to wait 200 seconds – more than three minutes – before the database would admit it could not supply an answer. Years later, however, Merck implemented a database appliance solution that provides answers in about 15 seconds, Louie said, dramatically improving the user experience.
Robert Wade, a Pfizer research fellow based in Groton, Connecticut (USA), said that three departments within the pharmaceutical sciences unit, the company’s pre-commercial development arm, have been linked using an ELN. The company estimates that it saved US$2 million a year when the system was used by only 230 people. Today, the ELN system has 900 users.
“Once the data became available and easily sharable, the fears about using the system went right away,” Wade said. “Everyone demanded access to the data of other scientists working on similar projects.”
END-TO-END VISIBILITY
Like Pfizer, companies across the life sciences industry see the promise of extending these and other IT systems throughout their organizations, into clinical trials and, ultimately, of linking them with regulatory agencies.
Janssen’s McKenzie, for example, envisions the possibility of using IT systems to demonstrate to the US Food and Drug Administration (FDA) and its global counterparts that products comply with regulatory expectations and are consistent with the materials used in clinical trials. Today, that system is largely paper-based, making it time-consuming and labor-intensive.
“We should be able to do that with the hit of a button instead of having a large amount of paper and reviews and people interacting with documents,” McKenzie said. “I see a large efficiency to be gained.”
PERSONALIZED MEDICINE
Add it all up and the promise of medicine personalized to specific sub-groups of a disease – even to specific individuals – begins to glimmer. The ultimate dream, of course, is that cancer patients could have their genome decoded and their particular variation of cancer analyzed to determine which compound, group of compounds or DNA sequence will kill the disease with minimal side effects to healthy tissue.
In the materials field, scientists see the potential to improve the way semiconductors are made, to build cars held together by adhesives rather than heavy metal welds, and to formulate paints that dry faster with fewer environmental hazards. Despite all the challenges, the lure of major advances is driving the world’s scientific community to make its next great leap forward into, potentially, the best of times. ◆
William J. Holstein is a New York-based business journalist and author. His most recent book is “The Next American Economy: Blueprint for a Real Recovery.”