Smoke Without Fire: How Safe are e-Cigarettes?

By: Nigel Whittle
Head of Medical & Healthcare

30th May 2018

Tradition has it that on 27th July 1586 Sir Walter Raleigh introduced smoking to England, arriving with colonists bringing tobacco, maize and potatoes from Virginia. It is likely however that tobacco had already been smoked by Spanish and Portuguese sailors for many years since its discovery by the native people of Central and South America thousands of years previously.

Fast forward nearly 400 years to 1963, when Herbert A. Gilbert invented and patented the first e-cigarette as an alternative to burning tobacco. His design for a smokeless, non-tobacco cigarette incorporated flavour cartridges, a heating element, and smokeless flavoured air. However in the 60’s there was no pressing demand for a healthier alternative to smoking, and it was not a commercial success. It wasn’t until the 2000s that Hon Lik, a Chinese pharmacist and part-time medical researcher, created the first modern e-cigarette as a practical device containing nicotine dissolved in a solvent. Hon Lik was inspired to develop his concept after witnessing his father’s death from smoking-induced lung cancer.

e-Cigarette Technology

Nowadays there are many brands of e-cigarette, but all are essentially battery-powered devices, usually cylindrical in shape, containing a solution of nicotine, water, and propylene glycol. When you take a puff, a sensor detects the drop in pressure and triggers a battery-powered heating element, which rapidly vaporises the solution to create a vapour that can be inhaled. The action, called “vaping”, is increasingly becoming the preferred way to consume nicotine, with over 3 million people in the UK using e-cigarettes, either as a tobacco substitute or as a means to cut back on smoking.

The technology around vaping continues to advance: the ability to control temperature avoids overheating the carrier liquids, or causing a ‘dry puff’, in which the wick becomes too dry, and burns the ingredients rather than producing vapour. Other enhancements range from improved battery life, to the use of visual displays and Bluetooth connectivity to display and transfer information about vaping parameters and activities.

Image courtesy of Science Focus (www.sciencefocus.com)


Safety Pros and Cons

The increase in usage over recent years has been paralleled by a debate about whether vaping can be considered safe, or merely safer than smoking cigarettes, and what role it should have in smoking cessation. The active ingredient, nicotine, which is crucial to cigarette addiction, is not considered carcinogenic, although it is formally a toxin that in high doses can affect adolescent brain development or harm a developing foetus, so it can never be deemed entirely safe. Importantly, however, e-cigarettes contain far fewer of the harmful substances, such as tar and carbon monoxide, produced by smoking tobacco, and therefore presumably provide a safer experience.

Because the trend in vaping has developed so fast, clinical research into the practice is struggling to catch up. One approach is to analyse e-cigarette liquids, and the vapour they produce, to demonstrate that they contain lower levels of toxic chemicals than tobacco cigarettes. However, it is more important to show what concentrations of chemicals users are actually exposed to in the real world. Such studies can be difficult and complex to conduct, often involving comparisons between vapers, tobacco smokers, non-smokers, and even those using nicotine replacement therapy. In general these studies, as summarised in a recent Cochrane Review, have demonstrated that e-cigarettes are far safer than smoking, although the most significant benefits come from stopping smoking altogether.

Regulation of e-Cigarette Use

There is considerable variability in the regulation of e-cigarettes across countries, ranging from no regulation at all to outright bans. The unregulated manufacture of e-liquids in countries such as China has led to legitimate concerns over potential health impacts, and there is mounting pressure for worldwide alignment of regulation, as already exists for traditional tobacco products. It was not until 2014 that the EU required standardisation and quality control of liquids and vaporisers, disclosure of ingredients in liquids, and tamper-proof packaging. Similarly, in 2016 the US FDA announced the comprehensive regulation of all electronic nicotine delivery systems.

One result of this regulation is the need for rigorous product development and testing by e-cigarette companies, which are generating increasing amounts of data to demonstrate the integrity of their products. It is inevitable that, as the industry matures, it will develop its own quality standards and operate under specific GxP standards for improved quality control.

In Conclusion

Over 100,000 people die each year as a result of smoking-related illnesses in the UK alone. Vaping, on the other hand, has not been linked with a single death in the UK. The advice from Cancer Research UK is that smoking tobacco is the single biggest preventable cause of death in the world, and if you are a smoker, the best thing you can do for your health is to stop. But through 50 years of development, vaping technology has created a significantly safer alternative to traditional smoking and an effective tool for helping people to stop smoking.

The Future of Disposable Medical Devices

By: Polly Britton
Project Engineer, Product Design

7th March 2018

As one of the few sectors where the waste produced is increasing year by year, the healthcare industry has a huge interest in disposable devices. The cost benefits of shorter-lifetime devices include reduced maintenance, simpler sterilisation and sheer convenience. These benefits are driving interest in and demand for medical devices akin to the “razor and cartridge” product model – inherently designed and produced to be partly or completely disposable.

In the case of healthcare waste, the danger of cross-infection from re-using devices or recycling the waste is considered more important than the conservation of materials and energy. This is why many tools and devices used in hospitals are disposed of after a single use, and all the waste is incinerated. Based on current trends, the amount of waste produced by the healthcare industry is likely to increase over the coming years as more medical products become disposable – a trend that can already be observed in the year-on-year increase in medical waste – although it is possible that it will eventually reverse.

Why make a product disposable?

Despite a culture of environmental consciousness, all businesses ultimately have to follow financial incentives in order to be competitive in the market. When deciding whether to reuse a product, the main factors to be considered in most industries are:

A. How much does the product cost to purchase?
B. How much will it cost to store the product between uses and prepare it for its next use?

The word “cost” here means not only the monetary cost, but the cost in effort and time spent by whoever does the purchasing in the case of “A”, and whoever does the using, storing, and preparation of the product in the case of “B”.

If the “reuse cost”, B, is higher than “purchase cost”, A, the product is usually disposed of after every use. If A is bigger than B the product is kept and re-used. This is similar to the calculation to determine whether to fix a product when it breaks or to buy a new one: Is the cost to repair the product greater than or less than the value of the product?
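As a hypothetical illustration (the function name and the example costs are invented here, not taken from the article), the decision rule above can be sketched in a few lines:

```python
def should_dispose(purchase_cost: float, reuse_cost: float) -> bool:
    """Decide whether to dispose of a product after each use.

    purchase_cost (A): cost of buying a replacement.
    reuse_cost (B): cost of storing, cleaning, and preparing the
    product for its next use. In practice each figure bundles money,
    time, and effort into a single illustrative number.
    """
    # Dispose when reusing costs more than simply buying another (B > A).
    return reuse_cost > purchase_cost


# Illustrative figures only: a glove is pennies to buy but costly to
# safely decontaminate, while an expensive instrument is the reverse.
print(should_dispose(purchase_cost=0.05, reuse_cost=2.00))      # True  -> dispose
print(should_dispose(purchase_cost=20000.0, reuse_cost=150.0))  # False -> reuse
```

The same comparison underlies the repair-or-replace decision mentioned above; only the meaning of the two costs changes.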

What makes healthcare products different?

In the case of the healthcare industry, products often need to be sterilised before being used, which is especially important if the product has previously been used on another patient. The cost of disinfecting equipment is high since the staff doing the work need to be trained professionals. This makes the cost to reuse very high compared to other industries. Even after rigorous cleaning and disinfecting, the risk of cross-contamination cannot be eliminated completely, which introduces an additional factor: “risk to patient’s health”, which cannot be quantitatively compared to factors “A” and “B”.

It is the high reuse cost and the additional risk to patients’ health that have resulted in so many healthcare products being designed as single-use, such as gloves, paper gowns, syringes, and some surgical tools. Some of this waste is officially classified as “hazardous” and therefore cannot legally be disposed of in landfill, so almost all healthcare waste is incinerated, including much of the non-hazardous waste produced by hospitals, which is not kept separate.

Why might healthcare products become more disposable?

Advances in manufacturing and automation have decreased the production cost of many products, which has reduced their purchase prices. If this trend continues, products that are now considered too valuable to throw away will become so inexpensive that they will start to be considered disposable. This could include electrical products and complex surgical tools. Furthermore, once these products are designed specifically to be single-use they can be made from cheaper materials and processes that will bring the price down even more.

Although disposing of more waste by incineration causes concern for the environment, in the case of medical technology, keeping costs low allows more people to have access to effective healthcare.

How could products become less disposable?

Future advances in technology might instead reduce the cost of reusing products, and hence the incentive to dispose of them. If automation can be introduced into the disinfection process for medical products, the requirement for trained staff to clean equipment manually could be greatly reduced, and an automated disinfection process might even be more effective at reducing the risk of cross-infection. What’s more, disinfection could be combined with an automated inventory management system of the type already seen in other industries.

Government regulations and incentives relating to environmental concerns could also have a big impact on the market. For example, medical products in the UK are currently exempt from the Waste Electrical and Electronic Equipment (WEEE) Directive, but if that were to change, the cost of making any electrical medical device disposable would increase.

Conclusion

The future of disposable medical devices is hard to predict since the market is driven by new technology while at the same time advances in technology are driven by market demands. With the advance of inexpensive manufacturing technology, more products may become disposable, but advances in automated sorting, cleaning, and storing could have the opposite effect. In addition, the culture of concern for the environment could also drive the government to change the relevant regulations.

For at least the short-term future, it seems more medical devices will become disposable and medical waste will continue to increase in volume per patient. However, any predictions about the healthcare market more than 20 years from now can only be speculative, due to the fast-paced nature of technological improvements.

Wearable Technology Enters the Clinic

By: Nigel Whittle
Head of Medical & Healthcare

13th December 2017

Our continuing fascination with electronic gadgets and a growing awareness of health and fitness issues will, undoubtedly, drive huge sales of wearable fitness devices this Christmas. The range of such devices from Fitbit, Jawbone, Garmin and the like is continually expanding, as are their capabilities. They can now routinely measure heart rate, blood pressure, step count, sleep quality and temperature, while integration with Bluetooth-enabled smartphones allows fitness data to be monitored and analysed further.

The rising prevalence of lifestyle diseases, such as obesity, has led to an interest from consumers and doctors in using wearables to alleviate these conditions, and also to a growing realisation that they can be valuable in monitoring patients in formal clinical studies. Although most wearables are targeted firmly at the commercial market, many such devices are now being specifically designed with a primary function of medical monitoring in mind.

The distinguishing feature of these devices, apart from the need to manufacture to the appropriate quality standard, is often the incorporation of embedded technology, which can facilitate the secure collection and transfer of large amounts of data wirelessly. This allows health data from the subject to be collected effortlessly in the background, in the ‘real world’, thereby increasing the validity of the data and lowering the impact on both patient and clinical centres.

Pharmaceutical companies are always on the look-out for ways to manage costs to meet the increasing complexity of drug development, and the use of wearable devices in clinical trials represents one potential area where savings can be made. Perhaps unsurprisingly, there are now over 300 clinical studies involving the use of wearable devices for monitoring a range of medical conditions. Wearable devices are arguably most useful in monitoring highly prevalent chronic illnesses such as diabetes, hypertension, congestive heart failure, and chronic obstructive pulmonary disease. But, in principle, wearables can be used in any disease where outcomes can be measured in terms of improved vital signs and enhanced movement.

An area that is likely to see a rapid expansion is in neurological conditions such as Parkinson’s Disease and Alzheimer’s Disease, where analysis of gait and movement can provide powerful insights into disease progression. In addition, wearables are also useful in clinical studies where the patient needs to keep a diary, by simplifying the record-taking process, and by providing prompts and reminders to improve compliance with the treatment schedule.

One fascinating development is the recent approval of the Abilify MyCite digital medicine/wearable combination, in which an anti-psychotic medicine is formulated in a pill containing a tiny sensor that, after activation by stomach acid, sends a unique identifying signal to a patch worn on the patient’s chest. The patch automatically logs the date and time of the signal and can transmit the information via Bluetooth to a paired mobile device. However, a number of commentators have suggested that it is somewhat odd to attempt this medical approach with patients who may suffer from paranoid beliefs and delusions about being spied upon…

With the prospect of developing richer and more complex patient health profiles, the primary benefit of wearable devices is access to real-world, continuous measurement of a patient’s health through unobtrusive tracking during their daily routines. Other benefits include reduced costs for visits to clinical centres, access to a wider pool of geographically dispersed patients, and potentially decreased variability of data – allowing fewer patients to be recruited into the study in order to achieve the required statistical significance.
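The last point follows from standard power-analysis arithmetic: the number of patients needed per study arm grows with the square of the outcome’s variability, so halving the standard deviation roughly quarters recruitment. A minimal sketch (a textbook two-sample formula, not a calculation from the article; the function name and figures are illustrative):

```python
from math import ceil
from statistics import NormalDist


def patients_per_arm(sigma: float, delta: float,
                     alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate sample size per arm for comparing two means:
    n = 2 * (z_{alpha/2} + z_beta)^2 * sigma^2 / delta^2

    sigma: standard deviation of the measured outcome.
    delta: smallest treatment effect worth detecting.
    """
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)   # two-sided significance threshold
    z_b = z.inv_cdf(power)           # desired statistical power
    return ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)


# Continuous wearable data that halves outcome variability cuts the
# required cohort roughly fourfold (illustrative numbers).
print(patients_per_arm(sigma=10.0, delta=5.0))  # 63 patients per arm
print(patients_per_arm(sigma=5.0, delta=5.0))   # 16 patients per arm
```

This is the quantitative sense in which “decreased variability of data” allows fewer patients to be recruited while preserving statistical significance.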

More speculatively, developments in predictive analytics offer the potential of alerting researchers to damaging side effects (Adverse Events in medical terminology) before they occur, based on data analysed from individuals and patient groups. The continuous collection of data from wearables will undoubtedly provide valuable insights into a patient’s well-being, potentially by associating activity levels or spikes in blood pressure with drug dosing. But the real long-term significance is the opportunity to aggregate and integrate the data into a comprehensive and holistic model of patient well-being, generating new insights into a range of disease states, and suggesting new approaches to treatment.

There are, of course, problems associated with the use of wearable technology in clinical studies. These include the uncertain regulatory status of the devices themselves, the overwhelming amount of data that will be collected for analysis, and concerns over patient data security and ownership. Moreover, the pharmaceutical industry tends to be conservative and risk-averse, perhaps due to strong regulatory oversight and concerns over potential litigation. As a result, the use of wearable technology to study illnesses and their treatment has arguably lagged behind other industries.

However, this may be set to change as pharma companies start to engage, and more technology companies enter clinical research and drive the use of wearables. As the technology develops, the collection, aggregation, and analysis of data will surely transform our understanding of diseases and treatment options, to the benefit of drug developers and patients alike.

Headphones Listen for Tinnitus Symptoms

Early diagnosis of tinnitus requires a visit to the audiologist, but a set of headphones paired with a smartphone app may alert consumers to this hearing condition. The technology developed by Cambridge, UK-based Plextek is intended to take tinnitus testing and prevention out of the clinical environment.

Plextek features on IEEE Electronics 360 website.

Data holds the answer to NHS dilemma

The pressures on the NHS show no sign of abating. After the threat of yet another winter crisis, with a surge in patient numbers causing almost a third of hospital trusts in England to warn they needed urgent action to cope, could relief from some of these pressures come from a more effective use of data?

Plextek features on the Raconteur website.
