Advanced Technologies in Healthcare

By: Nigel Whittle
Head of Medical & Healthcare

21st March 2019

Some of the biggest changes in the practice of medicine and healthcare over the past 70 years have resulted from improvements in the way diseases and illnesses can be diagnosed and studied. Innovative technologies now allow doctors to discover increasing amounts of detailed information about both the progression and treatment of disease, allowing new treatment options and care pathways.

The most significant developments which are likely to change the face of medicine over the next few decades include:

  • Enhanced self-management for patients and the elderly through technology support systems to empower understanding and control of conditions.
  • Improved patient access to health service infrastructure through utilisation of remote care and monitoring systems.
  • Further developments in medical imaging and the application of Artificial Intelligence systems to effectively analyse and diagnose conditions.
  • Precision medicine that can target medical interventions to specific sub-groups of patients based on genomic data.
  • Robotic surgical systems that can conduct exquisitely precise operations in difficult-to-reach anatomical areas without flagging or losing concentration.

Self-Management for Patients

Day-to-day physiological monitoring technology, driven particularly by the spread of consumer wearable devices with communication capabilities, can collect and integrate health information from a variety of sources, both medical and consumer-based. The next generation of wearables is likely to significantly blur the division between lifestyle accessory and medical device, as reliable non-invasive sensors for the measurement of blood pressure, blood sugar, body temperature, pulse rate, hydration level and more are increasingly built into these devices. The provision and integration of these complex data sets has the potential to provide valuable information, enabling a holistic approach to healthcare. The US FDA is currently working closely with industry to facilitate the introduction and effective use of these more advanced devices.
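
As a deliberately simplified illustration of that integration step, the sketch below merges readings from several hypothetical sensors into one chronological record; the field names and values are invented, not taken from any real device or standard.

```python
# A sketch of the data-integration idea: readings from several wearable
# sensors merged into one time-ordered stream for one subject.
# All field names and values are invented for illustration.
from datetime import datetime

readings = [
    {"time": datetime(2019, 3, 21, 8, 0), "source": "wrist", "pulse_bpm": 62},
    {"time": datetime(2019, 3, 21, 8, 0), "source": "patch", "temp_c": 36.7},
    {"time": datetime(2019, 3, 21, 8, 5), "source": "cuff", "bp_mmhg": (118, 76)},
]

# Merge into a single chronological record that a clinician (or algorithm)
# can review holistically rather than sensor-by-sensor.
for r in sorted(readings, key=lambda r: r["time"]):
    print(r["time"].isoformat(), {k: v for k, v in r.items() if k != "time"})
```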

Enhanced Patient Access

In the UK, the NHS has brought high-quality medical services to every citizen, but often at the cost of long waits to see a doctor when a patient is concerned about their health. The introduction of improved access systems, including video-conferencing facilities, electronic health records and AI-powered chatbots, promises to be a powerful and game-changing move. In particular, chatbot systems such as Babylon Health or Ada can provide a highly accessible medical triage procedure, which can relieve pressure on over-worked doctors in GP surgeries and allow them to focus on patients with more serious conditions. With increasing sophistication, these chatbots can potentially provide accurate diagnostic advice on common ailments without any human involvement. The key concern is, of course, ensuring that the algorithms operate with patient safety foremost, which requires fine-tuning to balance over-caution against under-diagnosis.
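
That tuning trade-off can be made concrete with a minimal sketch: a single escalation threshold on a symptom-risk score decides between self-care advice and referral. The scoring scale and threshold below are invented for illustration and not taken from any real triage product.

```python
# A hedged sketch of the over-caution vs under-diagnosis trade-off:
# the escalation threshold is the tuning knob.
def triage(risk_score: float, escalate_threshold: float = 0.3) -> str:
    """Return a triage outcome for a symptom-risk score in [0, 1]."""
    if risk_score >= escalate_threshold:
        return "refer to GP"       # cautious path: human review
    return "self-care advice"      # low-risk path: no appointment needed

# Lowering the threshold escalates more cases (over-caution, more GP load);
# raising it risks missing serious conditions (under-diagnosis).
for score in (0.1, 0.35, 0.8):
    print(score, "->", triage(score))
```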

Medical Imaging and Artificial Intelligence

Following admission to a hospital, a key element of modern medicine is the use of imaging systems for clinical diagnosis, and the main challenge for doctors is to interpret the complexity and dynamic changes of these images. Currently, most interpretation is performed by human experts, which can be time-consuming, expensive and prone to human error caused by visual fatigue. Recent advances in machine learning have demonstrated that computers can extract richer information from images, with a corresponding increase in reliability and accuracy. Eventually, Artificial Intelligence will be able to identify and extract novel features that are not discernible to human viewers, allowing enhanced capabilities for medical intervention. This will allow doctors to re-focus on their interaction with patients, which is often cited as the most valued aspect of medical care.
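
As a toy illustration of the scoring idea only: a model weighs image-derived features and produces an abnormality score for human review. Real systems use deep networks trained on large labelled datasets; every number below is invented.

```python
# A minimal sketch (not a clinical model): score a stand-in "scan" by
# weighing simple image statistics with invented learned parameters.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))              # stand-in for a grayscale scan

# Hand-rolled "features" a model might learn to weigh: overall intensity
# and local contrast. A trained network would learn far richer features.
features = np.array([image.mean(), image.std()])
weights, bias = np.array([2.0, -4.0]), 0.5   # invented parameters

score = 1.0 / (1.0 + np.exp(-(features @ weights + bias)))  # sigmoid
print(f"Abnormality score: {score:.2f}")  # a radiologist reviews high scores
```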

Precision Medicine

The current paradigm for medical treatment is changing through the development of powerful new tools for genome sequencing, which allow scientists to understand how genes affect human health. Medical decisions can now take account of genetic information, allowing doctors to tailor specific treatments and prevention strategies for individual patients.

In essence, precision medicine is able to classify patients into sub-populations that are likely to differ in their response to a specific treatment. Therapeutic interventions can then be concentrated on those who will benefit, sparing expense and often unpleasant side effects for those who will not.
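
A minimal sketch of that classification step, assuming a single invented genomic marker and an invented marker-to-therapy mapping:

```python
# Stratify patients by a genomic marker and select the therapy expected
# to work for that sub-population. Marker names and the therapy mapping
# are invented for illustration.
patients = [
    {"id": "P1", "marker": "EGFR+"},
    {"id": "P2", "marker": "EGFR-"},
    {"id": "P3", "marker": "EGFR+"},
]

therapy_by_marker = {
    "EGFR+": "targeted EGFR inhibitor",   # sub-group likely to respond
    "EGFR-": "standard chemotherapy",     # default pathway for the rest
}

for p in patients:
    print(p["id"], "->", therapy_by_marker[p["marker"]])
```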

Robotic Surgery

Currently, robotic surgical devices are simply instruments that translate the surgeon’s actions outside the patient to movements inside the patient, often working through incisions as small as 8 mm. The benefits are clear both in terms of minimally invasive surgery and in allowing surgeons to operate in a relaxed, stress-free environment. At the moment the robot does nothing without direct input, but with the continuing development of AI systems it is likely that in 10 or 15 years certain parts of an operation, such as suturing, may be performed automatically by a robot, albeit under close supervision.

What will new technology mean for healthcare?

It is fiendishly difficult to predict the impact of innovative technological advances on medical practice and patient care. However, the overall message is clear: improvements in front-end technology will allow patients to take greater responsibility for their own health and well-being. Increased access to medical services through innovative and efficient mechanisms will allow doctors to focus their time on the patients identified as suffering from more serious illnesses. Highly trained AI systems can then complement the doctors’ prowess in identifying and diagnosing particular diseases. Finally, treatment options will be highly tailored to individual patients and their conditions, increasing the cost-effectiveness of treatment.

However, each of these technological developments comes with associated costs and challenges. Not least, new technology could fundamentally change the way that medical staff work, requiring new skills and mindsets if medical care is to be transformed effectively.

For an informative chat on how Plextek can assist with your Healthcare technology project, please contact Nigel at healthcare@plextek.com

The New Science of Genetic Medicine

By: Nigel Whittle
Head of Medical & Healthcare

9th January 2019

With surprisingly little fanfare, in October 2018 NHS England became the first health service in the world to routinely offer genetic medicine in the fight to treat cancer.

From that date, hospitals across England have been linked to specialist centres that can read, analyse and interpret DNA isolated from patients with cancer. Through this service, cancer patients can be screened for the existence of key mutations within their tumours that can indicate the best drugs for treatment or to point towards clinical trials of experimental therapies that may be beneficial.

The move marks a big step toward precision medicine, which offers more effective therapies tailored to individual patients.

What is the science underpinning this move?

Firstly, a quick crash course in cancer biology:

  • Cells are the building blocks of every living organism. The instructions (or genes) that tell a cell how to develop and what to do are encoded in long linear molecules of DNA found in the nucleus of the cell.
  • These DNA molecules can be damaged over time or through exposure to chemicals or environmental changes. Cells become cancerous when specific changes in the DNA, called ‘driver mutations’, tell cells to grow faster and behave abnormally.
  • Many cancers form solid tumours, which are masses of tissue, while cancers of the blood, such as leukaemia, generally do not form solid tumours.
  • As these cancer cells multiply to form a tumour, selective pressure increases the number and type of harmful mutations found within the DNA.
  • The cells may acquire additional properties through mutation, such as malignancy, which means that they can spread into, or invade, nearby tissues. In addition, as these tumours grow, some cancer cells break off and travel to distant parts of the body and form new tumours far from the original site.

Accordingly, although every cell of a particular cancer is related to the same original “parent” cell, the mixture of cells within a tumour becomes increasingly complex. The idea that different kinds of cells make up one cancer is called “tumour heterogeneity”, and in practice means that every cancer is unique. So two people with, say, lung cancer who are the same age, height, weight, and ethnicity, and who have similar medical histories, will almost certainly have two very different cancers.

By the time a cancer tumour is 1cm in diameter, the millions of cells within it are very different from each other, and each cancer has its own genetic identity created by the DNA in its cells.

This, of course, makes the treatment of cancer incredibly difficult and explains why scientific breakthroughs in the understanding of cancer biology do not always lead to significant improvements in overall survival rates.

How will cancer treatment change?

Precision medicine is an approach to patient care that allows doctors to select the best treatments for patients based on a genetic understanding of their disease. The idea of precision medicine is not new, but recent advances in science and technology have allowed the ideas to be brought more fully into clinical use.

Normally, when a patient is diagnosed with cancer, he or she receives a standard treatment based on previous experience of treating that disease. Yet different people typically respond to the same treatment differently, and until recently doctors didn’t know why. Now the understanding that the genetic changes within one person’s cancer may not occur in others with the same type of cancer has led to a better understanding of which treatments will be most effective.

At the simplest level, this understanding allows targeted therapy against cancer, in which drugs (quite often complex biological molecules) are used to target very specific genetic changes in cancer cells. For example, around 15–20% of malignant breast cancers contain cells with a higher than normal level of a protein called HER2 on their surface, which stimulates them to grow. When a HER2-targeted drug is combined with a suitable diagnostic test, not only can the drug be given to those patients most likely to benefit, but it also need not be given, with its associated side effects, to patients who will not benefit from its use.
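
The arithmetic behind that “test, then treat” logic is simple; the sketch below applies the 15–20% prevalence figure quoted above to a hypothetical cohort of 1,000 patients.

```python
# Illustrative arithmetic only: how a biomarker test narrows treatment
# to the patients likely to benefit. The cohort size is hypothetical.
cohort = 1000                      # hypothetical breast cancer patients
her2_positive_rates = (0.15, 0.20)  # ~15-20% prevalence, per the text

for rate in her2_positive_rates:
    treated = round(cohort * rate)  # given the HER2-targeted drug
    spared = cohort - treated       # spared an ineffective drug's side effects
    print(f"At {rate:.0%} prevalence: treat {treated}, spare {spared}")
```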

So genetic medicine has already transformed the treatment of some cancer patients. The advent of widespread genetic medicine within the NHS is likely to lead to significant benefits for cancer patients, including:

  • The identification of patients who are most likely to benefit from a particular cancer therapy.
  • The avoidance of unnecessary treatments that are less likely to work for specific groups of patients.
  • The development of novel therapies targeted at specific tumour cells or cellular pathways.

Precision medicine will not only allow the development of precise and effective treatment strategies that improve cancer patients’ overall quality of life, but will also finally destroy the myth of ‘one size fits all’ cancer therapy.

For an informative chat on how Plextek can assist with your Healthcare technology project, please contact Nigel at healthcare@plextek.com

Smoke Without Fire: How Safe are e-Cigarettes?

By: Nigel Whittle
Head of Medical & Healthcare

30th May 2018

Tradition has it that on 27th July 1586 Sir Walter Raleigh introduced smoking to England, arriving with colonists bringing tobacco, maize and potatoes from Virginia. It is likely however that tobacco had already been smoked by Spanish and Portuguese sailors for many years since its discovery by the native people of Central and South America thousands of years previously.

Fast forward nearly 400 years to 1963, when Herbert A. Gilbert invented and patented the first e-cigarette as an alternative to burning tobacco. His design for a smokeless, non-tobacco cigarette incorporated flavour cartridges, a heating element, and smokeless flavoured air. However, in the 1960s there was no pressing demand for a healthier alternative to smoking, and the device was not a commercial success. It wasn’t until the 2000s that Hon Lik, a Chinese pharmacist and part-time medical researcher, created the first modern e-cigarette as a practical device containing nicotine dissolved in a solvent. Hon Lik was inspired to develop his concept after witnessing his father’s death from smoking-induced lung cancer.

e-Cigarette Technology

Nowadays there are many brands of e-cigarette, and all are essentially battery-powered devices, usually cylindrical in shape, containing a solution of liquid nicotine, water, and propylene glycol. When you take a puff on one, a microphone-style pressure sensor detects the drop in pressure and triggers the battery-powered heating element, which rapidly heats the solution to create a vapour that can be inhaled. The action, called “vaping”, is increasingly becoming the preferred way to consume nicotine, with over 3 million people in the UK using e-cigarettes, either as a tobacco substitute or as a means to cut back on smoking.

The technology around vaping continues to advance: the ability to control temperature avoids overheating the carrier liquids, or causing a ‘dry puff’, in which the wick becomes too dry, and burns the ingredients rather than producing vapour. Other enhancements range from improved battery life, to the use of visual displays and Bluetooth connectivity to display and transfer information about vaping parameters and activities.
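
A hedged sketch of the firing logic these paragraphs describe: a sufficient pressure drop switches the heater on, and a temperature cap guards against the ‘dry puff’ condition. All thresholds are invented for illustration rather than taken from any real device.

```python
# Puff detection plus a temperature cut-off, as a toy control rule.
AMBIENT_KPA = 101.3
PUFF_DROP_KPA = 0.5       # pressure drop that counts as a puff (invented)
MAX_COIL_TEMP_C = 230.0   # cut-off to avoid burning the wick (invented)

def heater_on(pressure_kpa: float, coil_temp_c: float) -> bool:
    puff_detected = (AMBIENT_KPA - pressure_kpa) >= PUFF_DROP_KPA
    return puff_detected and coil_temp_c < MAX_COIL_TEMP_C

# Simulated samples: idle, normal puff, puff with an overheating coil.
for pressure, temp in [(101.3, 25.0), (100.6, 180.0), (100.6, 240.0)]:
    print(pressure, temp, "->", "heat" if heater_on(pressure, temp) else "off")
```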

Image courtesy of Science Focus (www.sciencefocus.com)


Safety Pros and Cons

The increase in usage over recent years has been paralleled by a debate about whether vaping can be considered safe, or just safer than smoking cigarettes, and what role it should have in smoking cessation. The active ingredient, nicotine, which is crucial to cigarette addiction, is not considered carcinogenic, although it is formally a toxin which in high doses can potentially affect adolescent brain development or harm a developing foetus, so it can never be deemed entirely safe. But importantly, e-cigarettes contain far fewer of the harmful substances, such as tar or carbon monoxide, produced by smoking tobacco, and therefore presumably provide a safer experience.

Because the trend in vaping has developed so fast, clinical research into the practice is struggling to catch up. One approach is to analyse e-cigarette liquids, and the vapour they produce, to demonstrate that they contain lower levels of toxic chemicals than tobacco cigarettes. However, it is more important to show what concentrations of chemicals users are actually exposed to in the real world. Such studies can be difficult and complex to conduct, often involving comparisons between vapers, tobacco smokers, non-smokers, and even those using nicotine replacement therapy. In general, these studies, as summarised in a recent Cochrane Review, have demonstrated that e-cigarettes are far safer than smoking, although the most significant benefits come from stopping smoking altogether.

Regulation of e-Cigarette Use

Regulation of e-cigarettes varies considerably between countries, ranging from no regulation at all to outright bans. The unregulated manufacture of e-liquids in countries such as China has led to legitimate concerns over potential health impacts, and there is mounting pressure for worldwide alignment of regulation of the kind that exists for traditional tobacco products. It was not until 2014 that the EU required standardisation and quality control of liquids and vaporisers, disclosure of ingredients in liquids, and tamper-proof packaging. Similarly, in 2016 the US FDA announced the comprehensive regulation of all electronic nicotine delivery systems.

One result of this regulation is the need for rigorous product development and testing by e-cigarette companies, who are generating increasing amounts of data to demonstrate the integrity of their products. It is inevitable that as the industry matures it will begin to develop its own Quality Standards and operate under specific GxP standards for improved quality control.

In Conclusion

Over 100,000 people die each year as a result of smoking-related illnesses in the UK alone. Vaping, on the other hand, has not been linked with a single death in the UK. The advice from Cancer Research UK is that smoking tobacco is the single biggest preventable cause of death in the world, and if you are a smoker, the best thing you can do for your health is to stop. But through 50 years of development, vaping technology has created a significantly safer alternative to traditional smoking and an effective tool for helping people to stop smoking.

The Future of Disposable Medical Devices

By: Polly Britton
Project Engineer, Product Design

7th March 2018

As one of the few sectors where the waste produced is increasing year by year, the healthcare industry has a huge interest in disposable devices. Shorter-lifetime devices offer cost benefits in reduced maintenance and sterilisation as well as greater convenience. These benefits are driving interest in, and demand for, medical devices akin to the “razor and cartridge” product model: inherently designed and produced to be partly or completely disposable.

In the case of healthcare waste, the danger of cross-infection from re-using devices or recycling the waste is considered more important than the conservation of materials and energy. This is why many tools and devices used in hospitals are disposed of after a single use, and all the waste is incinerated. Based on current trends, the amount of waste produced by the healthcare industry is likely to increase over the coming years as more medical products become disposable, although it is possible that the trend will eventually reverse.

Why make a product disposable?

Despite today’s environmentally conscious culture, all businesses ultimately have to follow financial incentives in order to remain competitive in the market. When deciding whether to reuse a product, the main factors that most industries need to consider are:

A. How much does the product cost to purchase?
B. How much will it cost to store the product between uses and prepare it for its next use?

The word “cost” here means not only the monetary cost, but the cost in effort and time spent by whoever does the purchasing in the case of “A”, and whoever does the using, storing, and preparation of the product in the case of “B”.

If the “reuse cost”, B, is higher than “purchase cost”, A, the product is usually disposed of after every use. If A is bigger than B the product is kept and re-used. This is similar to the calculation to determine whether to fix a product when it breaks or to buy a new one: Is the cost to repair the product greater than or less than the value of the product?
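
As a worked illustration of that rule (with invented figures, not real device costs):

```python
# The reuse-or-dispose rule from the text as a one-line decision.
def dispose(purchase_cost_a: float, reuse_cost_b: float) -> bool:
    """Dispose after use when reuse (B) costs more than repurchase (A)."""
    return reuse_cost_b > purchase_cost_a

print(dispose(purchase_cost_a=2.00, reuse_cost_b=15.00))   # True: single-use
print(dispose(purchase_cost_a=900.0, reuse_cost_b=40.0))   # False: keep and reuse
```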

What makes healthcare products different?

In the case of the healthcare industry, products often need to be sterilised before being used, which is especially important if the product has previously been used on another patient. The cost of disinfecting equipment is high since the staff doing the work need to be trained professionals. This makes the cost to reuse very high compared to other industries. Even after rigorous cleaning and disinfecting, the risk of cross-contamination cannot be eliminated completely, which introduces an additional factor: “risk to patient’s health”, which cannot be quantitatively compared to factors “A” and “B”.

It is the high reuse cost and the additional risk to patients’ health that has resulted in so many healthcare products being designed as single-use, such as gloves, paper gowns, syringes, and some surgical tools. Some of this waste is officially classified as “hazardous” and therefore cannot legally be disposed of in landfill, so almost all healthcare waste is incinerated, including a lot of non-hazardous waste produced by hospitals, which is not kept separate.

Why might healthcare products become more disposable?

Advances in manufacturing and automation have decreased the production cost of many products, which has reduced their purchase prices. If this trend continues, products that are now considered too valuable to throw away will become so inexpensive that they will start to be considered disposable. This could include electrical products and complex surgical tools. Furthermore, once these products are designed specifically to be single-use they can be made from cheaper materials and processes that will bring the price down even more.

Although disposing of more waste by incineration causes concern for the environment, in the case of medical technology, keeping costs low allows more people to have access to effective healthcare.

How could products become less disposable?

Future advances in technology might also reduce the cost of reusing products, and hence the incentive to dispose of them, in the healthcare industry. If automation can be introduced into the disinfection process for medical products, the requirement for trained staff to clean the equipment manually could be greatly reduced, and an automated disinfecting process might even be more effective at reducing the risk of cross-infection. What’s more, the disinfection process could be combined with an automated inventory management system of the type already seen in other industries.

Government regulations and incentives relating to environmental concerns could also have a big impact on the market. For example, medical products in the UK are currently exempt from the Waste Electrical and Electronic Equipment (WEEE) Directive, but if that were to change, the cost of making any electrical medical device disposable would increase.

Conclusion

The future of disposable medical devices is hard to predict since the market is driven by new technology while at the same time advances in technology are driven by market demands. With the advance of inexpensive manufacturing technology, more products may become disposable, but advances in automated sorting, cleaning, and storing could have the opposite effect. In addition, the culture of concern for the environment could also drive the government to change the relevant regulations.

For at least the short-term future, it seems more medical devices will become disposable and medical waste will continue to increase in volume per patient. However, any predictions about the healthcare market more than 20 years from now can only be speculative, due to the fast-paced nature of technological improvements.

Wearable Technology Enters the Clinic

By: Nigel Whittle
Head of Medical & Healthcare

13th December 2017

Our continuing fascination with electronic gadgets and a growing awareness of health and fitness issues will, undoubtedly, drive huge sales of wearable fitness devices this Christmas. The range of such devices from FitBit, Jawbone, Garmin and the like is continually expanding, as are their capabilities. They can now routinely measure heart rate, blood pressure, step count, sleep quality and temperature, while integration and connection with Bluetooth-enabled smartphones allows further capabilities to monitor and analyse fitness data.

The rising prevalence of lifestyle diseases, such as obesity, has led to an interest from consumers and doctors in using wearables to alleviate these conditions, and also to a growing realisation that they can be valuable in monitoring patients in formal clinical studies. Although most wearables are targeted firmly at the commercial market, many such devices are now being specifically designed with a primary function of medical monitoring in mind.

The distinguishing feature of these devices, apart from the need to manufacture to the appropriate quality standard, is often the incorporation of embedded technology, which can facilitate the secure collection and transfer of large amounts of data wirelessly. This allows health data from the subject to be collected effortlessly in the background, in the ‘real world’, thereby increasing the validity of the data and lowering the impact on both patient and clinical centres.

Pharmaceutical companies are always on the look-out for ways to manage the costs arising from the increasing complexity of drug development, and the use of wearable devices in clinical trials represents one potential area where savings can be made. Perhaps unsurprisingly, there are now over 300 clinical studies involving the use of wearable devices for monitoring a range of medical conditions. Wearable devices are arguably most useful in monitoring highly prevalent chronic illnesses such as diabetes, hypertension, congestive heart failure, and chronic obstructive pulmonary disease. But, in principle, wearables can be used in any disease whose outcomes can be measured in terms of improved vital signs or enhanced movement.

An area that is likely to see a rapid expansion is in neurological conditions such as Parkinson’s Disease and Alzheimer’s Disease, where analysis of gait and movement can provide powerful insights into disease progression. In addition, wearables are also useful in clinical studies where the patient needs to keep a diary, by simplifying the record-taking process, and by providing prompts and reminders to improve compliance with the treatment schedule.

One fascinating development is the recent approval of the Abilify MyCite digital medicine/wearable combination, in which an anti-psychotic medicine is formulated in a pill containing a tiny sensor that, after activation by stomach acid, sends a unique identifying signal to a patch worn on the patient’s chest. The patch automatically logs the date and time of the signal and can transmit the information via Bluetooth to a paired mobile device. However, a number of commentators have suggested that it is somewhat odd to attempt this medical approach with patients who may suffer from paranoid beliefs and delusions about being spied upon…

With the prospect of developing richer and more complex patient health profiles, the primary benefit of wearable devices is access to real-world, continuous measurement of a patient’s health through unobtrusive tracking during their daily routines. Other benefits include reduced costs for visits to clinical centres, access to a wider pool of geographically dispersed patients, and potentially decreased variability of data – allowing fewer patients to be recruited into the study in order to achieve the required statistical significance.
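
The link between data variability and recruitment numbers can be made concrete with the standard two-arm sample-size formula, n per arm = 2(z_α + z_β)²σ²/δ². The sketch below compares an invented “clinic visit” variability with an invented, lower “continuous wearable” variability; all figures are illustrative.

```python
# Fewer recruits are needed when measurement variability (sigma) falls,
# because required sample size scales with sigma squared.
from math import ceil

z_alpha = 1.96   # two-sided 5% significance
z_beta = 0.84    # 80% power
delta = 5.0      # smallest treatment effect worth detecting (invented units)

for sigma in (12.0, 8.0):   # clinic-visit vs continuous wearable measurement
    n = ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)
    print(f"sd={sigma}: about {n} patients per arm")
```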

More speculatively, developments in predictive analytics offer the potential of alerting researchers to damaging side effects (Adverse Events in medical terminology) before they occur, based on data analysed from individuals and patient groups. The continuous collection of data from wearables will undoubtedly provide valuable insights into a patient’s well-being, potentially by associating activity levels or spikes in blood pressure with drug dosing. But the real long-term significance is the opportunity to aggregate and integrate the data into a comprehensive and holistic model of patient well-being, generating new insights into a range of disease states, and suggesting new approaches to treatment.
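
A hedged sketch of that predictive idea in its simplest form: flag blood-pressure spikes that occur shortly after a dose so researchers can review a possible adverse event early. The data, threshold and time window are all invented for illustration.

```python
# Associate blood-pressure spikes with recent dosing, as a toy alert rule.
doses_h = [0, 24, 48]                          # dosing times (hours)
bp_sys = {1: 121, 25: 158, 30: 122, 49: 124}   # hour -> systolic mmHg

SPIKE_MMHG = 150   # invented alert threshold
WINDOW_H = 4       # look for spikes within a few hours of a dose

for hour, bp in sorted(bp_sys.items()):
    if bp >= SPIKE_MMHG and any(0 <= hour - d <= WINDOW_H for d in doses_h):
        print(f"hour {hour}: systolic {bp} mmHg shortly after dosing - review")
```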

There are, of course, problems associated with the use of wearable technology in clinical studies. These include the uncertain regulatory status of the devices themselves, the overwhelming amount of data collected for analysis, and concerns over patient data security and ownership. Moreover, the pharmaceutical industry tends to be conservative and risk-averse, perhaps due to strong regulatory oversight and concerns over potential litigation. As a result, the use of wearable technology to study illnesses and their treatment has arguably lagged behind other industries.

However, this may be set to change as pharma companies start to engage, and more technology companies enter clinical research and drive the use of wearables. As the technology develops, the collection, aggregation, and analysis of data will surely transform our understanding of diseases and treatment options, to the benefit of drug developers and patients alike.
