
The Future

Otsuka: Real-Time Data Changes Trial Paradigm



At any given time, Otsuka Pharmaceutical Development & Commercialization, Inc. (OPDC) has between 60 and 100 ongoing clinical trials. Like many pharma companies, Otsuka relies heavily on outsourcing.

As the Otsuka VP of Applied Innovation and Process Improvement, Debbie Profit spends a lot of her time thinking about the future of clinical trials. “There are changes occurring in the industry that will force clinical development executives to shift their mindset,” says Profit. “I am trying to understand what clinical trial sites will look like going forward. I think we also need to change our vision of where and how we will leverage CROs in the future and think about whether CROs as we know them today will even be in our future. Ten years from now, I am not sure if the current CRO model will exist.”

Technology is changing the paradigm of clinical trials. At the same time, companies are trying to look at clinical trials through the eyes of the patient, caregiver, and other involved stakeholders. This convergence is changing how Otsuka looks at trials. For years, the company viewed technology as a tool to create trial efficiencies. Today, the company is focused on how technology can enhance the experience of patients.

Otsuka has a heavy focus on serious mental illnesses. Approximately 75 percent of the company’s business is in mental illness and central nervous system. Profit notes it’s an area that many companies have abandoned, but one where Otsuka is dedicated to making an impact for patients.

Unfortunately, professional help can be difficult for patients to access. Today, there are not enough medical professionals to meet the growing demand for mental health treatment in the U.S. The number of psychiatrist job postings increased by more than 97 percent between 2010 and 2014. The shortage is so severe that many patients no longer have access to treatment: one report found that 60 percent of counties in the U.S. (and 80 percent of rural areas) do not have a single psychiatrist. It now takes mental health patients an average of 25 days to get an appointment with one.

“Community mental health centers are grossly under-funded and under-resourced,” states Profit. “They have very little funding for growth and innovation. I worked at a community mental health center for seven years, and today that facility in New Hampshire looks exactly as it did when I left there 26 years ago. I went there in 2012 to speak to the staff about digital medicine and was literally sitting on the same furniture.”

Medication adherence is an even bigger problem. Adherence is a problem across all patient groups but is lowest for patients with mental illness. The Mental Illness Policy Organization notes the failure of individuals with schizophrenia and bipolar disorder to take prescribed medications (usually antipsychotics and/or mood stabilizers) is one of the most serious problems in psychiatric care. Non-adherence can lead to relapse of symptoms, rehospitalization, homelessness, incarceration, victimization, or episodes of violence. The same report found more than half of these patients do not take their medications because they do not believe they are sick.

Further contributing to the problem is a clinical trial process that is antiquated and not patient centric. Profit states clinical trials today are conducted no differently than they were 26 years ago when she left the clinical setting. Most trials are still paper-based and involve little technology. When she was in a clinical setting, Profit provided informed consent forms to hundreds of patients. When she moved to a sponsor company, she contributed to writing many informed consent forms. “My sister passed away from ovarian cancer 15 years ago,” says Profit. “The informed consent form she received was 18 pages long and difficult to comprehend. Today, according to Johns Hopkins, the average consent form is even longer – about 22 pages. We are not making progress, and we must do better.”

Technology May Be The Answer

The challenges with psychiatrist access and clinical trials are what got Profit looking to technology for answers. She believes greater use of technology is necessary for the industry to transform clinical trials. For example, she notes up to $30,000 could be spent on EKGs in a 20-person trial that spans just eight weeks. So why not spend an additional $499 to give each patient an Apple Watch and keep them better engaged with the study? Not only would the study subject be more engaged, but the watch’s sensors would let researchers gather more data, upload it to the cloud, and then provide information back to the patient or a physician.

“Think about that parent sending a child with depression off to college,” says Profit. “All they want to know is that their child is taking their medicine. Physicians have told me how great it would be simply to know that a patient is taking their medicines. If a patient has a relapse, their doctor will want to know if the medication didn’t work or if it wasn’t taken. We now have technologies that can inform that.”

The real beauty of wearable data is that researchers can receive it in real time or near real time, terabytes of it from clinical trial patients throughout the day. Without wearables, Profit laments, data collection is lengthy: data is collected at the clinical site, sent to the CRO, then to data management, then to a biostatistics team, and finally to the clinical operations team. “That process can take six or eight weeks,” says Profit. “If I want to know how the patients are doing in a trial, we need to see the data in real time.”

Think About The User Experience

Profit also believes clinical trial participants need a better user experience. For example, for a clinical trial that included the development of a smartphone app, Otsuka applied user experience testing of the app with the patient and caregiver at every visit, a very non-traditional way of collecting consumer feedback in a clinical trial. Initially there was apprehension on the clinical trial team, who felt the company’s compliance and legal colleagues would never allow it. These were patients with serious mental illnesses, and there were risk factors and rules that applied, particularly around informed consent and protected health information. But the team stood firm and worked with those colleagues to find acceptable solutions. “We needed to do this,” she said. “This is our future. Medicine, science, and technology are converging. We need to know whether bipolar or schizophrenia patients can use the tools we are designing to help support their treatments.”

It turns out patients and caregivers were not a problem at all; they were willing to participate in these visits. Legal and compliance were not a problem either, as the proper approvals, privacy measures, and requirements were followed. But in looking into the situation, Profit did discover a bigger problem, one that revolved around the physicians and psychiatrists: when assessing patients, they often rely on qualitative rather than quantitative data.

“If you ask a psychiatrist how often their patient takes their meds, they will tell you it happens 60 to 80 percent of the time,” she says. “The reality is that number is closer to 30 to 40 percent. Physicians and psychiatrists want to believe their patients are taking their meds, and that’s part of the challenge with serious mental illness. Doctors will keep changing medicines (or increasing dosage amounts) for patients because they think the drugs are not working. In many cases the patients are simply not taking them.”

Redesign The Trial

Those discoveries led Profit to rethink clinical trial design and how technology was being used in trials. She realized that if OPDC wanted to get clinical trial data in real-time, she would need to use tools that could deliver it. For the company’s second clinical trial using digital medicine, Otsuka used eConsent, eSource, and eScanning of clinical supplies. This was also the first time Abilify MyCite (aripiprazole with sensor), a combination drug/device approved to track drug ingestion, was used as a study drug.

“I remember being at home working at my computer,” says Profit. “I was watching my computer screens and I was able to see the exact moment when a patient ingested our drug. I was so happy I hugged my golden retriever and exclaimed, ‘Otsuka did it!’”

Profit could see that the patient had completed their eConsent and could watch the eSource data coming in. She could verify that the patient had checked the chart, and she could see in real time that the patient had ingested their dose. The trial was taking place in California, and she was viewing all of it in real time from her home in Maine.

Profit was excited about the data she was seeing, but she knew it wouldn’t have real value unless it was integrated. “Our platform has 64 APIs coming from various data sources, including CROs, central lab and ECG vendors, wearables, and other sources across 40 clinical trials,” she says. “We needed to get all our data in one place and allow decision makers to visualize relevant data points in real time. Data solutions that facilitate integration and analytics across clinical study teams were the next step. We previously managed complex tables and listing documents that can stretch to more than 1,000 pages. With features like data dashboards and automated alerts, these solutions make it easier to review trial monitoring reports, manage clinical supplies, comply with regulations, and adhere to internal data governance structures.”
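The core idea behind that kind of platform can be sketched in a few lines. The snippet below is purely illustrative (the feed names, field names, and merge key are hypothetical, not Otsuka's actual API design): it shows how records arriving from separate vendor feeds can be joined into one patient-level view that a dashboard could then visualize.

```python
# Hypothetical sketch: merging trial data from multiple vendor feeds
# (CRO, wearables, labs) into one record per patient per timestamp.
# All names and fields are illustrative.
from collections import defaultdict

def merge_feeds(*feeds):
    """Merge records keyed by (patient_id, timestamp) from any number of sources."""
    merged = defaultdict(dict)
    for feed in feeds:
        for record in feed:
            key = (record["patient_id"], record["timestamp"])
            merged[key].update(record)  # later feeds enrich the same key
    return [dict(v) for v in merged.values()]

cro_feed = [{"patient_id": "P001", "timestamp": "2019-03-01T09:00", "visit": "week4"}]
wearable_feed = [{"patient_id": "P001", "timestamp": "2019-03-01T09:00", "heart_rate": 72}]

combined = merge_feeds(cro_feed, wearable_feed)
print(combined)
```

A dashboard or alerting layer would sit on top of records like these; the hard part in practice is the 64 heterogeneous APIs, not the merge itself.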

Incorporate Digital Therapeutics

For many in the pharma space, digital therapeutics are quickly becoming the new reality. Digital therapeutics are treatment options that use digital (often online) technology to treat a medical or psychological condition. They rely on behavioral and lifestyle changes spurred by digital prompts, using a variety of digital tools to help manage, monitor, and prevent illness in at-risk patients. These treatment options include mobile devices, apps, sensors, computers, and various Internet of Things devices to modify patient behavior.

“There are currently 490 digital therapeutics in development just in the mental health space,” says Profit. “Pharma companies have to realize they are not just competing against other pills. Today, they are competing against digital therapeutics that could potentially take the place of the pill or supplement a pill. There could be no drug involved at all and the therapy could be less invasive. We now know behaviors can be changed by use of a smartphone.”

That means pharma must start thinking about new competitors. Partnering with digital therapeutics companies is certainly a possibility and Otsuka just recently announced a collaboration agreement with Click Therapeutics to develop and commercialize a prescription digital therapeutic for treatment of Major Depressive Disorder (MDD). But pharma will also be competing with them for talent. Profit saw an infographic recently that showed companies need data scientists and machine learning talent. In the New York area alone, 3,000 job openings exist, but there are far fewer individuals available who meet the criteria. Therefore, partnering might be necessary to gain access to the available talent.

“More and more companies are getting into the health science space,” states Profit. “Companies like Best Buy and home security company ADT. These consumer product companies are in constant contact with the public and that is an advantage for them. Best Buy has invested around $6 billion in life sciences, and we already know that Google and Apple are also making large investments to transform the industry.”

Is Your CRO Along For The Ride?

Now that pharma companies are changing how they get their data, will their CROs adopt those changes too? Profit thinks they are trying. She notes Otsuka’s CROs are primarily using OPDC’s internal platforms.

“When it comes to technology, some CROs think they have the Lexus and we have the Toyota,” she quips. “The reality is, what they have is less than a Toyota. Still, the system that each company has is not relevant. The need for real-time data is what’s important. If you are using smart watches in a study, you want to get the data in real-time. Waiting to get that data is no longer a viable option for us. It was hard for our own teams to make that change, and it will be hard for the CROs to do so as well. Using new, available technologies requires a change of mindset.”

In the future, will it be possible for sponsors to get the data they need direct from sites and digital devices? Will bypassing the CRO altogether become an option? If all that becomes required from a CRO is monitoring visits to sites and having discussions with physicians, should pharma take on that responsibility? And will they want to?

“Going forward, not all clinical sites will look like they do now,” says Profit. “Will CVS and Walgreen’s become clinical trial sites? Will urgent care centers? Will we see more trials going to the patient, as opposed to having the patient come to the trial? These are things we must consider. We need to put our focus on the patient and not the site. That will involve having real discussions about how to truly make trials patient-centric.”

There will certainly be naysayers who complain this vision will not work. Profit notes there are colleagues who told her she was crazy when she promoted the idea of conducting trials using a completely e-based platform to deliver digital medicine in patients with serious mental illness. She now makes a point of inviting those naysayers to meetings.

“We know they are going to say that it can’t be done,” she says. “They will tell you there is no way the FDA will approve your plan. When everyone tells you what you can’t do, you begin to have discussions about what you can do. We have a dialogue and discuss what the challenges will be and map out the road forward. We discuss what the trial will look like, the resources required, what the timeline will look like, and what we need to do to manage the change. You can’t say, ‘We’re now going to go direct to patient,’ and not change your process. Mapping out the entire process will allow you to stay on track and ultimately deliver better products to patients and consumers who need our speed and flexibility. The reality is everyone wants progress, but no one wants change. Our job as industry leaders is to help people see the significant benefit of changing the clinical trial process for all in the value chain, most especially the patient.”


The Future

These 3 Computing Technologies Will Beat Moore’s Law




There’s a big lie about disruption going around. And folks aren’t spreading it intentionally.

Many smart investors I talk to genuinely believe it to be the truth.

If you accept this widespread lie, you’ll likely make poor decisions when investing in disruptive companies.

Here, I’ll explain the real truth and why it matters to disruption investors.

Your Smartphone Is More Powerful than an Early ‘90s Supercomputer

Your smartphone can do the job of a whole collection of gadgets.

It’s a phone, camera, camcorder, Walkman, watch, wallet, radio, global map, TV, VCR, and computer all in one.

It also packs more number-crunching power than an early-’90s supercomputer. And keep in mind, crunching numbers is all a supercomputer does. We have “Moore’s law” to thank for this.

Named after Intel co-founder Gordon Moore, it observes that computing power doubles roughly every two years.

This has led to exponential growth in computing power.

As you may know, exponential growth “snowballs” over time. It builds momentum and eventually leads to near-vertical gains.


For the past few decades, computing power has more or less followed this path.

This Is the Driving Force Behind Moore’s Law

Moore’s Law says the number of transistors that can fit on a computer chip doubles about every two years.

Transistors allow computers to compute. The more transistors you cram onto a chip, the more computing power it has.

Again, for the past 50 years, this has more or less held true. Back in 1965, only 64 transistors fit on the world’s most complex computer chip.

More than 10 billion transistors can fit on today’s chips.
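The doubling story is easy to sanity-check with arithmetic. Starting from the 64 transistors of 1965 and doubling every two years lands in the same ballpark as today's 10-billion-plus chips (a rough sketch; real chips did not double on a perfectly regular schedule):

```python
# Sanity check: 64 transistors in 1965, doubling every two years through 2019.
transistors = 64
for _ in range(1965, 2019, 2):  # 27 doublings
    transistors *= 2
print(f"{transistors:,}")  # 8,589,934,592 -- about 8.6 billion
```

That the naive projection comes within a factor of two of modern chips, 54 years later, is exactly why Moore's observation earned the nickname "law."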

Moore’s law is responsible for many of the giant stock market gains in the past few decades.

Leaps in computing power enabled big disruptors like Apple, Microsoft, and Amazon to achieve huge gains like 50,800%, 159,900%, and 111,560%.

And along the way, the companies that make the computer chips have gotten rich, too.

Taiwan Semiconductor, Micron Technology, and Intel achieved gains of 1,014%, 3,256%, and 35,050%.

Conventional wisdom is that Moore’s law will continue to snowball. As progress gets faster and faster, you can understand why many folks think we’re headed for a tech utopia.

It’s a great story. But it’s not quite true.

Moore’s Law Will Break Down

Moore’s law isn’t really a law. Gravity is a law. Moore’s law is an observation and a forecast.

As I mentioned, since 1965, it has held true. But here’s the key…

Within the next few years, Moore’s law will break down.

You see, although today’s transistors are microscopic, they still take up physical space. There’s a limit to how small you can make anything that occupies physical space.

We are now approaching that limit with transistors. So the progress predicted by Moore’s law must slow.

In fact, Moore’s law is already slowing down. Many technologists predict it will break down entirely between 2022 and 2025.

Does that mean progress will stop?

Not a chance.

New technologies will pick up where Moore’s law leaves off. There are three exciting computing technologies in development you should know about.

3D Computing Hits the Market Later This Year

What does a city do when it runs short on land? It builds skyscrapers.

By building “up,” you can create real estate with the footprint of a one-story building, but one that holds 100X more people.

Something similar is just getting underway in computing.

You see, the “guts” of computers have always been two dimensional. Flat computer chips sit on a flat motherboard. Nothing moves in 3D. There’s no “up” or “down” inside a computer chip.

That’s now changing. In December, Intel (INTC) introduced its new 3D chip technology. It plans to begin selling it later this year.

Tech reporters are touting it as “how Intel will beat Moore’s law.”

Chips stacked in 3D are far superior to ones placed side by side. Not only can you fit many times more transistors in the same footprint; you can also better integrate all the chip’s functions.

This shortens the distance information needs to travel. And it creates many more pathways for information to flow.

The result will be much more speed and power packed into a small space. Eventually, 3D chips could be 1,000 times faster than existing ones.

DNA Computing Is a Bit Further off, but Its Potential Is Mind-Boggling

DNA carries the instructions that enable life.

As incredible as it sounds, DNA can be used for computing. In 1994, computer scientist Leonard Adleman at the University of Southern California used DNA to solve a small instance of a well-known mathematical problem, the Hamiltonian path problem.

One pound of DNA has the capacity to store more information than all the computers ever built.

A thumbnail-size DNA computer could theoretically be more powerful than today’s supercomputers.

I won’t get deep into the science here. DNA computing is still very early stage. But several companies, including Microsoft (MSFT), are working to push the technology forward.

Quantum Computing Could Be the Ultimate Disruption

The science behind quantum computing will bend your mind. To understand its potential, all you really need to know is this.

The basic unit of conventional computation is the bit. The more bits a computer has, the more calculations it can perform at once, and the more powerful it is.

With quantum computing, the basic unit of computation is called a quantum bit—or qubit.

Bits add up linearly. To get a 20-bit computer, you combine ten 2-bit units: 2+2+2+2+2+2+2+2+2+2 = 20.

Qubits are different. Every qubit doubles computing power.

So, a 10-qubit computer could do 2x2x2x2x2x2x2x2x2x2 calculations at once, or 1,024.

A 100-qubit quantum computer could perform over 1,000 billion billion billion simultaneous calculations. Those numbers are too big for humans to comprehend.
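The growth described above is just powers of two, which a few lines of arithmetic make concrete. This is a simplification (it counts the basis states an ideal, error-free register spans, ignoring noise and error correction), but it reproduces both of the article's figures:

```python
# Each added qubit doubles the number of states a register can represent
# at once; classical bits only add one more binary digit.
def state_space(qubits: int) -> int:
    """Number of basis states an ideal n-qubit register spans."""
    return 2 ** qubits

print(state_space(10))   # 1024 -- the article's 10-qubit figure
print(state_space(100))  # ~1.27e30, "over 1,000 billion billion billion"
```

The exponential, rather than linear, scaling is the whole story: going from 100 to 101 qubits adds as much state space as the first 100 combined.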

In theory, a small quantum computer could exceed the power of a regular computer the size of the Milky Way galaxy.

With enough computing firepower, a quantum computer could tackle problems far beyond the reach of any classical machine.

If we ever achieve far-out goals like controlling the weather, colonizing Mars, or reversing human aging, quantum computing will likely be the driving force.

There Are No Pure-Play Quantum Computing Stocks

The companies working on it are all private or have been scooped up by larger firms.

Many of the big tech players are developing quantum computing technology. Microsoft, IBM, Google (GOOG), and Intel are a few.

Google looks to be in the lead.

In March 2018, it unveiled its Bristlecone quantum processor, which the company thinks could achieve “quantum supremacy.”

Quantum supremacy is the “tipping point” for quantum computing. It’s the point at which a quantum computer can beat the best classical computer at some task.

So far, scientists haven’t been able to crack this. But once quantum supremacy is reached, progress should take off very quickly.

This is yet another great reason to consider investing in Google.



The Future

NVIDIA’s Accelerated Computing Platform To Power Japan’s Fastest AI Supercomputer




Tokyo Tech is in the process of building its next-generation TSUBAME supercomputer featuring NVIDIA GPU technology and the company’s Accelerated Computing Platform. TSUBAME 3.0, as the system will be known, will ultimately be used in tandem with the existing TSUBAME 2.5 system to deliver an estimated 64.3 PFLOPS, in aggregate, of AI computing horsepower.

On its own, TSUBAME 3.0 is expected to offer roughly twice the performance of its predecessor. It will be built around NVIDIA’s Pascal-based Tesla P100 GPUs, which outperform previous-generation Maxwell GPUs in both performance per watt and performance per die area. TSUBAME 3.0 is estimated to deliver roughly 12.2 petaflops of double-precision compute performance, which would place it among the world’s 10 fastest systems according to the most recent TOP500 list.

A rendering of the Tokyo Tech supercomputer. Image credit: NVIDIA

The system’s architect, Tokyo Tech’s Satoshi Matsuoka said, “NVIDIA’s broad AI ecosystem, including thousands of deep learning and inference applications, will enable Tokyo Tech to begin training TSUBAME 3.0 immediately to help us more quickly solve some of the world’s once unsolvable problems.”

TSUBAME 3.0 is being designed with AI computation in mind, and is expected to deliver more than 47 PFLOPS of AI horsepower on its own.

“Artificial intelligence is rapidly becoming a key application for supercomputing,” said Ian Buck, vice president and general manager of Accelerated Computing at NVIDIA. “NVIDIA’s GPU computing platform merges AI with HPC, accelerating computation so that scientists and researchers can drive life-changing advances in such fields as healthcare, energy and transportation.”

TSUBAME 3.0 is expected to be completed this summer. It will be used for education and research at Tokyo Tech, and for information infrastructure for top Japanese universities, though there are plans to make the system accessible to private-sector researchers as well.



The Future

When 5G is here, a wireless supercomputer will follow you around




Next-generation tech like self-driving cars and augmented reality will need huge amounts of computing power.

AT&T (T) on Tuesday detailed its plan to use “edge computing” and 5G to move data processing to the cloud, in order to better support these new technologies.

“[Edge computing] is like having a wireless supercomputer follow you wherever you go,” AT&T said in a statement.

Rather than sending data to AT&T’s core data centers — which are often hundreds of miles away from customers — it will be sent to the company’s network of towers and offices, located closer to users.

Currently, data is either stored in those data centers or on the device itself.

“[Edge computing] gives the option now to put computing in more than two places,” Andre Fuetsch, president of AT&T Labs and chief technology officer, told CNN Tech.

For example, let’s say you’re wearing VR glasses but the actual virtual reality experience is running in the cloud. There could be a delay in what you see when you move your head if the data center is far away.

AT&T aims to reduce lag time by sending data to locations much closer to you. (AT&T has agreed to acquire Time Warner, the parent company of CNN. The deal is pending regulatory approval.)

5G networks will be driving these efforts. Experts believe 5G will have barely any lag, which means a lot of the computing power currently in your smartphone can be shifted to the cloud. This would extend your phone’s battery life and make apps and services more powerful.

In the case of augmented and virtual reality, superimposing digital images on top of the real world in a believable way requires a lot of processing power. Even if a smartphone can deliver that promise, it would eat up its battery life.

With edge computing, data crunching is moved from the device to the “edge” of the cloud, which is the physical points of the network that are closer to customers.
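A back-of-the-envelope calculation shows why physical distance matters here. Light in optical fiber travels at roughly 200 km per millisecond, so propagation delay alone sets a floor on lag; the distances below are illustrative, and real latency also includes routing and processing time:

```python
# Rough propagation delay: distant data center vs. nearby edge site.
# Assumes ~200,000 km/s signal speed in fiber; distances are illustrative.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay only, ignoring routing and processing."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(800.0))  # distant data center (~500 miles): 8.0 ms
print(round_trip_ms(10.0))   # nearby edge site: 0.1 ms
```

For head-tracked VR, where the rendering loop runs many times per frame budget, shaving milliseconds of propagation delay is the difference between a seamless image and visible lag.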

5G will also enable faster speeds and could even open the door to new robotic manufacturing and medical techniques.

AT&T is rolling out edge computing over the “next few years,” beginning in dense urban areas.

