The ability of medical scanners to produce images of our bodies in exquisite detail provides a powerful tool for the diagnosis and management of a wide variety of diseases. While we may marvel at the images, we rarely consider the evidence that is required before these scanners can be used in the clinic. Recently we were invited to consider this question in a “Little Lecture”, a live online presentation.
If we look at the early days of medical imaging, it was often simply a case of “it looks good, so it must be good”. The birth of medical imaging can be traced back to 1895 and the discovery of x-rays by Wilhelm Röntgen. He produced the first x-ray image (of his wife Anna Bertha's hand), and by 1896 the first clinical studies were being conducted in Glasgow, Scotland. This set the tone for the introduction of new medical imaging techniques, which were often adopted shortly after being first demonstrated.
The 1970s saw a big increase in the cost of medical scanners with the introduction of tomographic devices that produced images of slices through the body, thereby removing the problem of organs being superimposed on each other that had been a feature of earlier techniques. Despite this increase in cost, adoption of the technology in the clinic was rapid.
Even technologies that appeared to take a long time to reach the clinic were actually taken up quite quickly once their clinical utility was proved. The potential clinical use of MRI was known as early as the 1950s, but it wasn’t until the invention of spin-warp imaging in Aberdeen that it could be used on living subjects. Once the clinical potential had been demonstrated, the technique was soon in routine use in a number of hospitals.
The turning point came with the development of Positron Emission Tomography (PET), a technique that allows images of function to be produced, including images of metabolism using radiolabelled sugars. While the clinical potential of this was immediately obvious, the cost of the radiotracers made PET more expensive than the earlier modalities and caused health providers around the world to pause before introducing it into the clinic.
The question of whether a technique should be adopted into the clinic should be considered in terms of its opportunity cost, i.e. the benefit of the other things that could be purchased with the same money, such as a new drug, a surgical procedure or an extra health worker. This question can be answered by assessing the relative cost-effectiveness of the new imaging technique using decision modelling, and this is the method that was ultimately used to justify the introduction of PET into the clinic in Scotland.
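To make the idea concrete, the core arithmetic of such an assessment is often summarised as an incremental cost-effectiveness ratio (ICER): the extra cost of the new technique per extra unit of health gained, with health commonly measured in quality-adjusted life years (QALYs). The sketch below is purely illustrative; the figures are invented, and it is not the model that was actually used for PET in Scotland.

```python
# Minimal sketch of an incremental cost-effectiveness ratio (ICER).
# All numbers are hypothetical, chosen only to illustrate the arithmetic.

def icer(cost_new, qalys_new, cost_old, qalys_old):
    """Extra cost per extra quality-adjusted life year (QALY) gained."""
    return (cost_new - cost_old) / (qalys_new - qalys_old)

# Hypothetical example: a PET-based diagnostic pathway vs. a conventional one.
pet = {"cost": 12_000, "qalys": 6.2}          # expected cost per patient (GBP) and QALYs
conventional = {"cost": 9_000, "qalys": 6.0}

ratio = icer(pet["cost"], pet["qalys"], conventional["cost"], conventional["qalys"])
print(f"ICER: £{ratio:,.0f} per QALY gained")  # prints: ICER: £15,000 per QALY gained

# A decision maker would compare this ratio against a willingness-to-pay
# threshold for one QALY; if the ICER falls below the threshold, the new
# technique is considered good value relative to what else the money could buy.
```

In practice a full decision model also accounts for uncertainty in the inputs, but the comparison of extra cost against extra health gained is the essence of the opportunity-cost argument above.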
If you’d like to learn more about medical imaging technologies, you might want to consider our MSc programmes in Medical Physics or Medical Imaging. If you’d like to learn more about health economics, including the methods used to measure clinical effectiveness and cost-effectiveness, then our MSc, PgDip or PgCert in Health Economics for Health Professionals might be for you. Or you can do a short course in Health Economics.
You can watch Prof Andy Welch and Prof Marjon Van der Pol's Little Lecture here.