Spaulding Clinical Research Videos  |  22 May 2017

Could Your Cardiac Safety Process Belong in a Museum?

 
What's Restraining Your Centralized Cardiac Safety Process?
 
Are you still paying big fees to ship ECG machines around the world, waiting weeks or months for your data, or collecting paper? That model belongs in a museum.

At Spaulding Clinical Research we've pioneered a modern process to meet modern demands. Watch Now

 

Farewell to TQT Studies by Dr. Jay Mason

 

 

Dr. Jay Mason

Dr. Jay Mason is a world-renowned cardiac drug safety expert with 35-plus years of experience in cardiac care and research. He is currently the chief medical officer for Spaulding Clinical Research (SCR). In this webinar, Dr. Mason discusses surveillance strategies to improve safety and avoid the Thorough QT study.

[ DOWNLOAD THE SLIDES ]

------------------------------------------------------------------------------------------------------

VIDEO TRANSCRIPT

Mr. Salmon:

Thank you Rebecca.

I'm very pleased to be here moderating today's webinar, entitled Farewell to TQT Studies. I'm also very happy to welcome all of you on behalf of Spaulding Clinical, where I serve as the General Manager and Senior VP of Research. I've had several clients comment to me on what a surprising title this was for today's webinar, considering Spaulding Clinical's expertise in Thorough QT studies. While we are experts in conducting Thorough QT studies, we actually conduct all types of clinical pharmacology studies, and what we are focusing on today is the growing trend of building the cardiac safety profile for your compound starting with your first-in-human study.

We have two great experts as presenters this morning, and I don't want to take any more of their valuable time, so let me start by introducing Dr. Jay Mason. Dr. Mason is a world-renowned cardiac drug safety expert with 35-plus years of experience in cardiac care and research. He graduated from Princeton and obtained his M.D. degree from the University of Pennsylvania.

He also trained in medicine and cardiovascular disease at Stanford University, where he was a member of the faculty from 1975 to 1983. He was chief of cardiology at the University of Utah from 1983 to 1999, where he still serves as a faculty member, and served as chairman of the Department of Medicine at the University of Kentucky from 1999 to 2003.

Dr. Mason's clinical, teaching, and research emphasis is in cardiac arrhythmia and electrophysiology. He has authored over 400 publications and has served on national research review committees for the NIH and the American Heart Association.

He has also served on several editorial boards, including those of the American Journal of Cardiology, Circulation, Annals of Internal Medicine, the American Journal of Medicine, and the Journal of the American College of Cardiology.

Dr. Mason is currently the chief medical officer for Spaulding Clinical Research, and his topic today is 'Cardiac Surveillance Strategies to Improve Safety and Avoid the Thorough QT Study.'

Before he gets started, we have a couple of poll questions for the audience, so Rebecca, please present the first poll question.

 

Rebecca:

Thank you very much, Mr. Salmon. Our first question for today is:

"Do you think it is necessary to perform some form of repolarization testing in humans on all or nearly all drugs?" That is, is the arrhythmia problem common enough, and is the method for testing and prevention effective enough, to justify the time required and cost? You have two options here: you can answer 'yes' or 'no'.

 

Dr. Mason: 

Rebecca, this is Jay Mason. Should questions be appearing on the screen?

 

Rebecca:

I believe they should, sorry about that. Let me just make sure they are.

 

Dr. Mason:

They aren't on mine; perhaps they are on others'.

 

Rebecca: 

So it looks like we have a few... 71% of you have already voted, so I'll just repeat the question.

Do you think it's necessary to perform some form of repolarization testing in humans on all or nearly all drugs?  And you have two options;  you can click 'yes' or 'no'.

Hey! It looks like all of you have committed your answers, and I'll just share these now.

So the answer to the question, do you think it's necessary to perform some form of repolarization testing in humans on all or nearly all drugs?

 

Vote:  66% of you answered 'Yes' while 34% of you answered 'No'.

 

And Dr. Mason do you see these results now?

 

Dr. Mason:

No, I don't. I don't see the poll or anything other than the chat box that....I pulled up...

 

Rebecca:

Okay, why don't I go ahead and ask the second poll question, and then we can talk privately to make sure that you are all set up for your presentation.

 

Dr. Mason:

Good. Okay.

 

Rebecca:

Right, so as mentioned, we have a second poll question for you today, and let me just go ahead and launch that right now. So, our second poll question for this morning:

 

Question:  Do you think the E14 TQT study is an appropriate method for preventing drug-induced TdP?

 

 And again you have two choices. You can click 'Yes' or 'No'.

 

Great! Thank you, everyone. Almost all of you voted today, and it seems like a straight 50/50 split. So, in answer to the question:

 

Question:   Do you think the E14 TQT study is an appropriate method for preventing drug-induced TdP?

 

Vote:  50% of you said yes and 50% of you said no.

And now our last poll question, for this part of the presentation anyway...

 

Dr. Mason:

Rebecca, what's the percentage there?

 

Rebecca:

It's on your screen right now. So, our third poll question for this morning:

 

Question:  If there were no TQT requirement, would your company likely bring more drugs to market? And again, you have two options here: 'Yes' or 'No'.

 

Thank you, everyone, for participating in our last poll question, which was "If there were no TQT requirements, would your company likely bring more drugs to market?"

 

Vote:  42% of you said yes, while 58% of you said no.

 

So at this point I'd like to pass the presentation over to Dr. Mason, and I'd like to hear his insights on what he thinks we gathered from the poll questions.

 

Dr. Mason:

Thanks, Rebecca. I wonder if we still have the problem; I'm sharing my screen, but I can't see anything other than the chat box. Can you see my screen?

 

Rebecca:

Yes, Dr. Mason, we can see your screen. Would you mind loading up your PowerPoint presentation?

 

Dr. Mason:

I will do just that, and then I will tell you what I think about these answers.

 

Rebecca:

Yeah, I think we're all dying to know. Great! So you have the poll results right there on your screen, and I'll just let you take it from here.

 

Dr. Mason:

Rebecca, is the chat box visible to you?

 

Rebecca:

No... not at all. All we can see right now is your PowerPoint presentation, as well as the toolbar with some of the applications you have installed on your computer, like Microsoft Office and...

 

Dr. Mason:

Okay very good.

 

On the first question, the majority, two-thirds, said yes, and I certainly think that's appropriate. As we'll briefly discuss, there is a substantial incidence of arrhythmia problems due to QT prolongation, and obviously this needs to be addressed in some way.

 

The response to the second question is about what I expected: half of you said that the E14 method for assessing repolarization liability is a good way to do it, which means that half of you don't like the E14 approach. I must say that I don't think it's ideal; in fact, that's really the topic of this webinar.

 

Number three: would more drugs be developed?

 

That's worrisome: 42% said yes, and that's a reason why Dr. Benson, I, and many others feel strongly that the Thorough QT study should be altered in some way so it is no longer an obstacle to drug development.

 

Okay, let's go on to the... that's interesting, I'm still not able to advance my slides. How about this way? Here we go! Everyone should be seeing a table listing drugs that have been withdrawn from the market for QT prolongation. Rebecca, if you are not seeing that, let me know right away. The list is pretty long; there are 16 listed here. The problem started in 1988, and the most recent withdrawal was the higher IV dose preparation of ondansetron. So it is a substantial problem.

Here's what we're going to talk about during the next 25 minutes. I'll briefly review what the original E14 guidance required, and then I'll update you on new requirements and clarifications that have resulted from the various Q&As issued by ICH, FDA, and Health Canada; I'll add the clarifications that routinely appear in FDA response letters to Thorough QT study protocol submissions. Then I'm going to give you a list of what I consider to be the more important deficiencies of the current E14 approach, and I'll follow that by providing alternative strategies in each case.

This is the original guidance. I'm sure everyone is familiar with it, and I won't spend any time beyond this slide discussing its requirements. As you know, it applies to almost all new drugs; roughly 95-plus percent of new drugs in fact have to undergo this requirement. The requirement is to detect a 5 ms change in QTc within a confidence boundary of 10 ms. I won't get into the statistical implications of this beyond indicating that it means you have to do a lot of ECGs in a fairly large study; the number of ECGs in studies I've been associated with ranges from 4,000 to 30,000. E14 requires that ECG interpretation be centralized, and the centralization should also include uniformity of equipment. The study must have a placebo control and an active control, which is usually moxifloxacin, and whenever possible it should explore a supratherapeutic dose, which ideally should be at least four times and up to 10 times the therapeutic dose. These studies are intended to be performed prior to approval of the drug, of course, and most commonly after Phase 2 and before Phase 3 begins.

Another requirement which isn't actually part of the E14 is that the electrocardiograms from these studies are submitted in an XML format to an ECG warehouse maintained by Mortara.

Now, this set of requirements has been added to and clarified over the years through the Q&A process and through direct communication from FDA. The most important change is that FDA now requires that the thorough analysis be applied to heart rate, PR, and QRS in addition to QTc. The FDA no longer requires that a clinical dose group be included in the study; that is, in most cases a supratherapeutic dose arm is all that's needed, so long as the PK/PD measurements are appropriate to allow a good, accurate PK/PD analysis.

Automation is acceptable, and it is internally validated in the Thorough QT study by the presence of a positive control, usually moxifloxacin. I will touch on the topic of automation again in a moment. I skipped the moxifloxacin bullet by accident, but as most of you know, originally there was a great deal of effort in blinding moxifloxacin; that's no longer required by FDA. I personally believe it's a good idea to blind if you can.

 

There have been recommendations offered by Zhang and Machado of FDA on how to calculate sample size, and most statisticians have accepted this approach, not because they necessarily agree but because they know that FDA will accept it. The Bazett correction is no longer required; that's good. A multiple-endpoint adjustment to control for type I error is required for the assay sensitivity test; most commonly the Hochberg procedure is used, and if you don't include this in your protocol FDA will contact you and ask for it. You also need to account for possible late effects; typically this means the channel-trafficking issue, and it means you need to have at least one time point at 24 hours or later. And finally, FDA is requesting that PK/PD modeling be done in all studies. Now let's get to the topic of what is wrong, in my opinion, with the Thorough QT study as defined by E14, and let's do that in order to consider how we can improve these issues.
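The shape of the sample-size calculation behind such recommendations can be sketched roughly. This is not the FDA authors' exact method, just a generic normal-approximation illustration with hypothetical numbers: to conclude a study is "negative," the one-sided upper confidence bound of the QTc difference must stay below the 10 ms margin, so power is computed against the gap between the assumed true effect and that margin.

```python
import math
from statistics import NormalDist

def tqt_arm_size(sd_ms=8.0, true_effect_ms=3.0, margin_ms=10.0,
                 alpha=0.05, power=0.90):
    """Rough per-arm N so the one-sided upper confidence bound of the
    QTc difference stays below margin_ms when the true effect is smaller.
    sd_ms is the assumed per-subject SD of the QTc difference; all
    default values here are illustrative assumptions."""
    z = NormalDist().inv_cdf
    gap = margin_ms - true_effect_ms   # distance to the 10 ms limit
    n = 2 * (sd_ms * (z(1 - alpha) + z(power)) / gap) ** 2
    return math.ceil(n)

n = tqt_arm_size()  # → 23 per arm under these assumed inputs
```

Note how quickly the required N grows as the assumed variability rises, which is why the talk keeps returning to measurement quality and variability reduction.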

The first problem I see is timing. On the graph on the left of this slide, I'm displaying curves for ECG quality in white (that basically means centralization and robust ECG methods), ECG quantity (the number of ECGs performed) in red, and the resultant information in green. As you can see, in most cases ECG quality is poor, the quantity is low, and the amount of information about ECG effects collected during Phase 1 and Phase 2 is minimal.

Now, this is not the way all sponsors perform studies, but the majority do. That means that we don't start collecting a lot of information and improving our knowledge about ECG drug effects until just before Phase 3 begins, and here we see the enormous spike in all three curves, followed by a decline in ECG quantity and in ECG quality (in other words, non-centralization) in Phase 3, so long as the Thorough QT study was negative.

 

Now this timing, in my view, presents two major problems:

1.      ECG quality, quantity, and resulting information are lowest in Phase 1, but ironically the highest doses and the most careful exploration of pharmacokinetics occur in Phase 1. So we're missing a tremendous opportunity.

 

2.      The second problem is that subjects are exposed to the drug in Phases 1 and 2 at supratherapeutic doses, often several hundred subjects and patients, and my question is: aren't these people entitled to the same protection that the Thorough QT study is designed to guarantee in Phase 3? Ostensibly the TQT study is done at the beginning of Phase 3, before exposure to the drug is greatly increased, but there are quite a few exposures prior to that point, and that doesn't seem rational to me.

 

So what could we do to improve this timing issue? 

 

If you take a quick look at the graph on the left, I think you'll immediately figure out what I'm thinking here: we use rigorous electrocardiography throughout the drug development cycle. We front-load ECG acquisition so we're getting the highest number of ECGs at the very beginning of human exposure, and this centralized, high-quality process persists through the entire development period. But the frequency of time points and the number of replicates at each time point can be diminished as ECG information is gained, and eventually the quantity of ECGs can become quite low. The result of this approach is that high-quality ECG information is gathered during maximum dosage, and early warning of QT effects and other ECG issues is available. This improves patient and subject safety from the very beginning and also gives the sponsor an opportunity to get out early, so to speak, if the drug is misbehaving with respect to ECG effects.

Of course, taking this approach avoids the formal Thorough QT study, and I think if you take pencil to paper you'll find that it will reduce the cost of drug development overall.

 

Another issue: moxifloxacin exposure. The fact is that moxifloxacin is not totally benign. It does prolong QT, and there are reports of torsades de pointes ventricular tachycardia occurring in subjects receiving moxifloxacin; re-challenges have been done that pretty well confirm that moxifloxacin does do this. It's somewhat ironic that we're required to administer this drug to normal volunteers in an effort to prevent this very problem.

Moxifloxacin also can cause hypersensitivity reactions; they are rare, but Stevens-Johnson syndrome and other severe hypersensitivities may occur. It also can cause C. difficile-associated disease after a single dose, and tendon ruptures are reported, though I suspect they would never occur after just one dose.

 

What is the alternative for an active control?

Well, first of all, I believe that a high quantity and quality of ECGs early during drug development reduces the need for an active control. Also, with the several years of experience we now have with Thorough QT studies, we know what the methods are, and where the investigative sites are, that can accurately measure QT and therefore don't need to be actively controlled. Also, if we maintain high-quality electrocardiography in Phases 3 and 4, we reduce the chance of missing a repolarization liability of the drug; so even if somehow we weren't doing a good job earlier on in measuring the QT, we retain the possibility of identifying problems later by maintaining high ECG quality. And indeed, there are other very sound methods for demonstrating assay sensitivity that have been proposed. None have been used as the sole method for active control as far as I know, but I may be wrong about that. Detecting the known change in QTc with positional change is one method, and using statistical methods to discern QTc precision is another approach.

One other problem with the Thorough QT study is the endpoint itself, QTc. QTc prolongation has extremely low specificity for subsequent development of torsades de pointes, and that's a big problem, probably the biggest problem you can point to. It can be improved on. T-wave change, that is, change in the morphology of the T-wave, has higher specificity in animal models by a considerable margin. Currently, T-waves are assessed subjectively, by eyeball, by the cardiologist over-reader; this subjective assessment is totally unquantified, and I think less reliable than quantification would be. So my alternative strategy here is to do other measurements, one of them being to quantify the T-wave. This might be done by T-wave segmentation, such as the Tpeak-Tend measurement, or it might be done by a method that I'm going to illustrate, in which the principal components, or eigenvectors, of the T-wave shape are quantified and then monitored over time.

You're looking at an X-by-Y plot of T-wave morphology, and when I click the start button you're going to see a movie of T-wave morphology change, but first I need to explain what's going on here. These data were generated by BioQT, a software utility produced by Oxford BioSignals for automated measurement of ECG intervals and T-wave morphology; we will be looking at T-wave morphology here. This still frame shows a two-dimensional representation of the first six eigenvectors that describe the shape of the T-wave. These six dimensions, or principal components, are resolved to a two-dimensional depiction by a neural clustering technique, which I don't understand and which I actively avoid learning anything about. But the 2-D display does make it possible for mathematical idiots like us to actually see how the T-wave morphology changes. Each heartbeat during the 24-hour Holter recording is represented in this X-by-Y plot by these blue dots; each blue dot is a T-wave morphology measurement.

Now, what I'm going to do is run the movie, and you're going to see these blue dots turning yellow briefly; these yellow beats are the beats occurring during a 10-minute window of time. This 10-minute window effectively slides through the data in chronological sequence from the beginning of the Holter to the end, so you see the morphology measurements of the current beats moving through this X-Y plot. We are actually only going to look at the first nine hours of this recording, and the data will be going at a rate of one hour of real time per one second of display time; it's a fast movie, it only lasts about 20 seconds, and I'll show it twice so you can really get the idea of what you're going to see. After the first half-hour of the recording, sotalol is administered at 320 mg orally in this particular subject, and then you begin to see dramatic T-wave change induced by the drug. At the upper left you're going to see Holter time displayed, and over on the right you may see some messages, but just ignore them; we're not going to deal with them today.

Now, as the movie plays, look at this pink ball right here. It represents an average of the 10-minute window that is currently lit up in yellow, so it gives you the average T-wave morphology at any point in time. Please don't worry about the other static green and red bubbles; we're not going to talk about them today. So I'll run the movie now, and I'll run it again. At first the ball was fairly static, but now it's really starting to move to different places. The drug effect is strong, and you can see huge T-wave change; indeed, you see a great increase in the variability of the T-wave, shown by a greater spread of the yellow dots. I'll play this again so you can appreciate that. The dots are close together and the ball is not moving much at first; then the dots start to spread and the ball moves further and faster. Pretty soon they are spread all over: a huge effect of sotalol on the T-wave. So basically, what we've done here is take a rich set of quantitative data describing the shape of the T-wave and turn it into a subjective moving picture, but please rest assured that these data can be used to detect and measure very precisely the extent of T-wave morphology perturbation.
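The eigenvector quantification described above can be sketched with ordinary principal-component analysis over beat-aligned T-wave samples. This is a toy illustration with synthetic beats, not the BioQT algorithm (in particular, its neural clustering step to two dimensions is not reproduced here); the beat count, sample length, and window size are all assumptions.

```python
import numpy as np

# Synthetic stand-in for Holter data: 500 beats, each T wave resampled
# to 100 points and baseline-aligned (real data would come from
# automated interval detection).
rng = np.random.default_rng(0)
t = np.linspace(0, np.pi, 100)
beats = np.sin(t) + 0.05 * rng.standard_normal((500, 100))

# Principal components = eigenvectors of the beat covariance matrix,
# obtained here via SVD of the mean-centered beat matrix.
centered = beats - beats.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:6]                 # first six eigenvectors, as in the talk
scores = centered @ components.T    # each beat's 6-D morphology coordinates

# The "pink ball": mean morphology over a sliding window of 50 beats,
# standing in for the 10-minute window in the movie.
window_means = np.array([scores[i:i + 50].mean(axis=0)
                         for i in range(0, len(scores) - 49, 50)])
```

Tracking `window_means` over time is the quantitative analogue of watching the ball drift across the plot: a drug-induced morphology change shows up as a systematic movement of the windowed mean in component space.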

Now let's go on to the next slide, and that's cost. The Thorough QT study is an expensive study. I've seen costs ranging from half a million dollars to $10 million; that half-million-dollar study, by the way, did not get through the Thorough QT successfully and ended up costing $850,000. So it's too expensive, and it truly inhibits drug development: at small biopharmas with low cash reserves, the Thorough QT requirement can literally halt drug development.

 

So what are some cost improvement strategies?

One of them is automation of the ECG measurements. This is faster and, if done properly, much less expensive; it eliminates human error; it reduces variability, so the sample size requirement is reduced as a result; and it makes very difficult or impossible things possible. It allows a good QTcI measurement, based on adequate baseline data, to be calculated; it allows T-wave morphology tracking, as we've just seen; and it allows a number of other analyses that require quantities of data you could never ask a cardiologist to measure in any reasonable time period.

And importantly, the FDA accepts these data in Thorough QT studies that have active controls. There are a number of types of automation. One is the automated algorithms that you're familiar with in ECG machines, but these apply primarily to single 10-second, 12-lead ECGs rather than to continuous data. There are also several newer stand-alone software products (I've listed three) that are capable of measuring Holter data and making a number of useful analyses possible. So automation is one way I think we can improve cost.

 

Another is to use purpose-built devices. I'm showing a device called the Spaulding IQ that was developed at Spaulding Clinical Research. This device was built for pharma studies, and the underlying motivation was to allow pharmaceutical companies to collect ECG data inexpensively. This machine reduces the cost of manufacturing and therefore the cost of leasing the electrocardiograph to pharma sponsors. Its small size and low weight (it weighs only three ounces) reduce the cost of shipping. As for its simplicity, you can see in the panel on the right that there really are only three steps to running this device; it has only one button and one small LED display, and it is very simple. So this reduces training costs, and it reduces human error, both because of the simplicity and because it has voice biometrics instead of keyboard entry for the subject ID. It enhances data quantity and quality because it provides a five-minute recording, which allows the development of a more reliable result. So using devices built for the job is another way to reduce costs. You're looking at a modified Picasso's The Dream; in our "No More TQTs" modification, there is a pill bottle that's not in the original.

 

So, in summary, the alternative strategy I'm recommending has these elements:

·        front load and taper ECG acquisition to get an earlier repolarization answer

·        eliminate the active control for patient safety and cost reduction

·        monitor T-wave morphology to get a better assessment of risk

·        automate ECG analysis to reduce costs, and use more cost-effective devices and analyses

 

Thanks for your attention. I believe, Rebecca, you will take over the screen display now?

 

Rebecca:

Yes, that's right Doctor Mason.

Thank you so much for that presentation and I'm just going to switch gears a little and come back to my screen.

 

So, as I mentioned, thank you, Dr. Mason. I see that we have some great questions coming in, and I would encourage all members of our audience to keep them coming; we'd love to get your input and see what you are thinking about.

So at this point I want to turn the presentation over to Mr. Daniel Salmon, who might want to say a few words about Dr. Charles Benson, who is our next speaker.

 

Mr. Salmon:

Thank you Rebecca.

Evaluating a Thorough QT Assessment Program

 

 

Robert Kent

Robert Kent

Robert Kent leads Phase 1 and ECG Centralization Business Development at Spaulding Clinical.

 

Introduction

Since the adoption of the ICH E14 guidance, “Clinical Evaluation of QT/QTc Interval Prolongation and Proarrhythmic Potential for Non-Antiarrhythmic Drugs” (2005), pharmaceutical companies engaged in the development of new drugs have been required to investigate the potential for a proarrhythmic effect by means of analysis of QT/QTc data derived from ECG recordings. Usually, this has been accomplished by carrying out a Thorough QT (TQT) study.

While the implementation of the ICH E14 guidance has been successful in allowing investigators and regulators to identify compounds that can significantly increase the risk of cardiac arrhythmias and sudden death and stop the development of these drugs, it has become apparent that the adoption of the guidance has had unintended consequences that are acting as an impediment to the successful development of new drugs. These include:

  1. Delays in development and consequent regulatory approval, due to the guidance recommendation that the TQT study be undertaken between the completion of Phase II and the start of Phase III studies.
  2. The QT interval itself is an imperfect biomarker for proarrhythmia. This has undoubtedly resulted in the premature discontinuation of many potentially effective drugs that in reality would not have posed a risk of severe cardiac side effects.

As a consequence of these limitations, the FDA, in conjunction with the Cardiac Safety Research Consortium (CSRC), has been looking at ways that the TQT study goals can be achieved more efficiently and accurately. There are two alternative proposals currently being assessed:

  1. More robust pre-clinical testing utilizing the CiPA proposal (Comprehensive in Vitro Proarrhythmia Assay). While this approach may show promise in theory, it is years away from being a suitable replacement for clinical testing.
  2. Replacement of the TQT study with increased ECG collection in SAD/MAD Phase I studies, in conjunction with concentration-effect modeling. A pilot study carried out with the support of the CSRC showed this approach was able to replicate the results that would be generated by a TQT study. Unfortunately, the exciting implications of this study have been somewhat marred by controversy concerning claims about “acceptable” methodology for analyzing the ECG data. However, it is not within the remit of this posting to enter that debate, except to say that there are several ways of analyzing the ECG data that may be acceptable to the relevant regulatory authorities, and that no one methodology is preferred over the others.

If this approach does eventually replace the TQT study, the smaller cohort sizes typically found in SAD/MAD studies will result in a smaller number of ECGs available for analysis. Consequently, accurate prediction of QT effect will require, more than in the past, that all ECGs collected are of the highest quality possible.
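The concentration-effect (C-QT) modeling mentioned above can be illustrated in miniature. The numbers below are invented, and a real analysis would use a linear mixed-effects model with confidence bounds rather than a plain least-squares fit, but the core idea is a regression of placebo-adjusted ΔQTcF on plasma concentration:

```python
import numpy as np

# Hypothetical matched observations from a SAD/MAD study:
# plasma concentration (ng/mL) and placebo-adjusted delta-QTcF (ms).
conc = np.array([0.0, 50.0, 120.0, 250.0, 400.0, 600.0])
dqtcf = np.array([0.2, 1.1, 2.0, 3.8, 6.1, 8.9])

# Simple least-squares line: delta-QTcF = intercept + slope * concentration.
slope, intercept = np.polyfit(conc, dqtcf, 1)

cmax = 600.0                                  # assumed supratherapeutic Cmax
predicted_effect = intercept + slope * cmax   # model-predicted dQTcF at Cmax

# In a negative study, the upper confidence bound of this prediction
# (not computed in this sketch) would stay below 10 ms.
```

The attraction of this design is visible even in the toy version: one model pools every ECG from every dose level, which is how smaller Phase I cohorts can still support a QT conclusion, provided each ECG is of high quality.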

Spaulding Clinical Research looks at three key metrics when determining the quality of ECG data collected and analyzed for the assessment of QT data: the percent of readable ECGs, heart rate SD, and QTcF SD.

Percent of Readable ECGs

While it might seem self-evident that readable ECGs must be collected by the investigative site at the correct time points relative to the PK/PD profile of the drug, it is remarkable how infrequently this quality metric is requested. The primary factor that has a detrimental effect on the percent of readable ECGs is the quality and surveillance of the connection between the ECG device and the subject/patient.

Poor-quality connections result in unusable tracings. Unfortunately, as a result of movement, moisture build-up between skin and electrode, etc., the quality of the electrode-to-skin contact deteriorates the longer the electrode is in place. In order to achieve the optimal percent of readable ECGs, it is necessary to have well-trained site staff carry out continuous, real-time surveillance of the ECG tracings. This is best undertaken at a site that has installed ECG equipment that allows both continuous collection of 12-lead ECG recordings and real-time review for quality, such as the Mortara Surveyor system. Sites using this system have the ability to assess the ECG data quality prior to the time point and take corrective action. Other methodologies are not as reliable: standard 12-lead devices require the site staff to take the ECGs in real time, but if there is a problem, the correction has to be made and the ECG may be delayed.

The most common way that ECGs are routinely collected for QT analysis is by utilizing 12-lead Holter equipment. Unfortunately, a major issue with 12-lead Holter recording is that most systems do not allow for real-time quality review; as a result, the percent of readable ECGs is lower. A related study (Salvi V et al. 2011) investigating the incidence of lead misplacement in ECGs collected for clinical studies found that limb lead placement errors occurred in 3.4% of all the ECGs reviewed (n=85,133), but the rate of limb lead placement errors was 7.5% in those ECGs derived from Holter data. While limb lead misplacement does not necessarily result in an unusable ECG, this difference is illustrative of the quality problems associated with 12-lead Holter collection and solutions based on it.

As we move to an earlier assessment of the QT interval with smaller sample sizes, data loss due to unreadable ECGs will become less acceptable.

Heart Rate SD

Most QT correction formulas are a function of heart rate (HR): the more stable the HR, the more stable the QTc will be. Unfortunately, the correlation between HR and QT is never strictly linear, so achieving HR stability is extremely important, as this maximizes the chance that any variation in QTc is a drug effect and not the result of external stimulation. As HR is sensitive to multiple external influences, it is imperative that the Clinical Pharmacology Unit undertaking the clinical conduct of the study make every effort to ensure that the effects of external stimuli are minimized.

QTcF SD

Fridericia’s QT correction (QTcF) is the standard correction formula submitted to the FDA for evaluation of potential drug-induced QT prolongation.

As the QT interval is dependent on HR, the strict control of factors affecting HR, as described in the previous section, is obviously important in reducing the variation in QTcF. Additionally, the accuracy of the measurement is dependent on the quality of the ECG, as poor-quality ECGs often do not allow accurate determination of the end of the T wave. This applies equally to ECG data interpreted by an automatic algorithm, by a cardiologist/technician, or by a combination of automated measurement with human review. Adequately trained personnel are essential if consistency, and thus lower variation, is to be obtained in QTcF measurements.
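Fridericia's correction and the two SD metrics described above can be computed directly. The readings below are hypothetical values chosen only to show the calculation:

```python
import statistics

def qtcf(qt_ms, rr_s):
    """Fridericia's correction: QTcF = QT / RR^(1/3), with RR in seconds."""
    return qt_ms / rr_s ** (1 / 3)

# Hypothetical baseline readings from one subject: (QT in ms, HR in bpm).
readings = [(400, 60), (398, 62), (405, 59), (402, 61)]

# RR interval in seconds is 60 / HR.
qtcf_values = [qtcf(qt, 60.0 / hr) for qt, hr in readings]

hr_sd = statistics.stdev(hr for _, hr in readings)   # heart rate SD metric
qtcf_sd = statistics.stdev(qtcf_values)              # QTcF SD metric
```

At 60 bpm the RR interval is exactly one second, so QTcF equals the raw QT; at faster rates the cube-root divisor shortens the corrected value, which is why HR instability feeds directly into QTcF variability.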

Conclusion

High-quality ECG data depends much more on the experience and processes of the investigative site collecting the data than on the experience of the core ECG laboratory.

Spaulding Clinical Research, which uniquely operates a 200-bed Clinical Pharmacology Unit and an ECG Core Laboratory, has collected its data on these factors since its inception. Our metrics are shown below.
