Can Small & Medium Enterprises Make a Big Environmental Impact?


By: Adam Roberts
Marketing Consultant

8th August 2019

4 minute read


Last year, Deloitte’s Global Human Capital Trends report showcased a new social, cultural and economic shift facing business leaders today.

This shift, dubbed “the rise of the social enterprise”, reflects how businesses are increasingly being judged on their employee and stakeholder relationships (something I touched upon in my last blog), their perception within their respective communities and their impact on society at large.

As you can guess from the title, let’s take a closer look at that last one.

The last decade has been building towards a tipping point, with major global organisations and SMEs across all industries pledging their efforts towards more sustainable and environmentally friendly operations.

In just the last five years, the worldwide push for change has been staggering on all fronts. In 2015, the first-ever universal global climate deal was signed in Paris, with 195 countries promising to improve their carbon-cutting plans to avoid dangerous climate change; in 2018, the international movement Extinction Rebellion rose to prominence, staging peaceful protests in cities around the world. Many countries have also passed legislation to cut down on, or ban entirely, single-use plastics, and are seeking solutions to lower CO2 emissions in densely populated areas where air pollution is fast becoming a worrying concern.

In the business world, the consumer goods company Unilever has pledged that all of its agricultural materials will come from sustainable sources by 2020, with CEO Alan Jope even warning that it will sell off profitable brands that do not contribute positively to society. The pub and restaurant chain Wetherspoons has become the first large business to stop using paper receipts entirely, and many other pubs and retailers have in recent years offered e-receipts as an alternative.

Society is largely in agreement – but more needs to be done, and we can all play our part in delivering a greener future.

So what can businesses, particularly SMEs, do to drive sustainability within their own companies and contribute positively to the larger landscape?

Well, this is something we have been looking into at Plextek through the formation of our new internal “Green Issues” special interest group. The mission: to improve our sustainability beyond what we already routinely do, by investigating and implementing a raft of practical measures that could improve our day-to-day environmental impact. We are also applying our technical capability and skills to drive environmental change through the launch of our EcoTech market.

The time to act is now

What could your business do? Donate profits to charitable causes? Switch to a clean energy provider? Reuse plastics and other materials within the internal production process? Switch to smart meters and encourage a stricter “power down” protocol for staff (turning equipment off after use)? Get rid of single-use plastics in the office? And on a personal level, what actions could we all take individually to change bad habits and reduce (or at least be mindful of) our own consumerism?

Realistically, if your company does decide to take significant steps towards becoming a more sustainable operation, many of those steps probably won’t make much business sense on paper. Switching to a clean energy provider, for example, will more than likely cost your company more money.

The ‘business’ case for sustainability, then, becomes less about appealing to the benefit of the ‘business’. And we simply can’t limit green practices to the subset that involves little cost, little risk, little disruption to routine and little effort to promote through marketing and PR channels – at the risk of damaging company credibility and reputation.

Instead, we must make this a ‘people’ case or, more likely for an SME, a ‘person’ case. C-suite management is in a unique position to drive change, not just within their own businesses but within their respective industries, with the capability to move more quickly than governments. It comes from a switch in motive and from answering questions like, “What do you believe in?”, “What do you want to be remembered for?” and “What interests do you serve?”


What Is 5G and How Does It Work?

By: Daniel Tomlinson
Project Engineer

18th July 2019

5 minute read


As a society that is becoming increasingly dependent on data driven applications, 5G promises to provide better connectivity and faster speeds for our network devices. However, whilst the previous generations of mobile communications have been fairly analogous to each other in terms of distribution and multiple user access, 5G will be drastically different – making it a challenging system to implement. So, how does it work?

Initial Concept

Fig 1 – The 5G Triangle: Enhanced Mobile Broadband, Massive IoT and Low Latency

As with any concept, 5G was initially based on a very broad and ambiguous set of standards, which promised low latency, speeds in the region of gigabits per second and better connectivity. Whilst none of the intricacies of the system were known at the time, we knew that in order to achieve faster data rates and larger bandwidths we would have to move to higher frequencies – and this is where the problem occurs. Due to the severe atmospheric attenuation experienced by high-frequency signals, range and power become serious issues that our current systems aren’t capable of handling.

Range & Power

A modern GSM tower features multiple cellular base stations that, together, are designed to transmit through 360⁰ horizontally and over a range in the order of tens of miles, depending on the terrain. However, consider that the received power from a cellular base station degrades with distance at a rate of…
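
In free space this is the familiar inverse-square law, where $P_r$ is the received power and $d$ the distance from the transmitter:

$$P_r \propto \frac{1}{d^{2}}$$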

And that by factoring in frequency, this effect worsens…
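
For isotropic antennas, this combined effect is captured by the free-space path loss, which grows with both distance $d$ and carrier frequency $f$ ($c$ being the speed of light):

$$\mathrm{FSPL} = \left(\frac{4\pi d f}{c}\right)^{2}$$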

…it becomes obvious that transmitting over larger distances and at higher frequencies rapidly becomes inefficient. Therefore, a key part of the 5G overhaul requires thousands of miniature base stations to be strategically placed in dense urban environments in order to maximise capacity with minimal obstructions.
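
To put rough numbers on this, here is a minimal Python sketch (the 1 km link and the two carrier frequencies are illustrative assumptions, not figures from any particular deployment) comparing the free-space loss of a sub-2 GHz 4G carrier with a 28 GHz millimetre-wave carrier:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB between isotropic antennas."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Same 1 km link, two very different carrier frequencies.
for f in (1.8e9, 28e9):
    print(f"{f / 1e9:5.1f} GHz: {fspl_db(1000, f):6.1f} dB")
# 1.8 GHz: ~97.6 dB; 28 GHz: ~121.4 dB
```

The roughly 24 dB gap – a factor of over 200 in power – is one reason why a dense grid of small cells, rather than a single distant mast, is needed at millimetre-wave frequencies.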

Directivity

Fig 2 – Radiation Pattern of an Isotropic Antenna versus an Antenna with Gain (Dipole)

One way to increase the range of a transceiver, whilst keeping the power output the same, is to incorporate gain into the antenna. This is achieved by focusing the transmitted power towards a particular point as opposed to equally in all directions (isotropic).

Figure 2 shows such a comparison, in which a dipole antenna’s energy is focused in the directions of 0 and 180 degrees. Equation three reflects this additional factor:
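
Assuming the equation referred to is the standard Friis transmission formula, the transmit and receive antenna gains, $G_t$ and $G_r$, enter as multiplying factors alongside the transmitted power $P_t$:

$$P_r = P_t\,G_t\,G_r\left(\frac{c}{4\pi d f}\right)^{2}$$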

However, as the essence of a wireless handset is portability, it is likely to move around a lot with the user. Therefore, a high gain 5G transmitter would still require a tracking system to ensure that it stays focused directly at the end user’s handset.

User Tracking

One solution for tracking devices could be to employ a high-frequency transceiver with a phased-array antenna structure. This would act as a typical base station, capable of transmitting and receiving, but an array of hundreds of small-scale patch antennas (and some DSP magic) would make it capable of beamforming. This would allow the structure not only to transmit high-gain signals but also to steer the beam by changing the relative phase of the outputs.
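
As an illustrative sketch of the underlying idea (the 64-element count, half-wavelength spacing and 28 GHz carrier are assumptions chosen for the example, not a description of any deployed system), the Python below derives the per-element phase shifts for a uniform linear array and checks that the main beam lands at the commanded steering angle:

```python
import numpy as np

C = 3e8  # speed of light, m/s

def steering_phases(n_elements: int, spacing_m: float,
                    freq_hz: float, steer_deg: float) -> np.ndarray:
    """Per-element phase shifts (radians) that point a uniform
    linear array's main beam towards steer_deg (0 = broadside)."""
    k = 2 * np.pi * freq_hz / C                  # wavenumber
    n = np.arange(n_elements)
    return -k * n * spacing_m * np.sin(np.radians(steer_deg))

def array_factor_db(phases: np.ndarray, spacing_m: float,
                    freq_hz: float, angles_deg: np.ndarray) -> np.ndarray:
    """Normalised array factor (dB) over a sweep of look angles."""
    k = 2 * np.pi * freq_hz / C
    n = np.arange(len(phases))[:, None]
    theta = np.radians(angles_deg)[None, :]
    af = np.exp(1j * (k * n * spacing_m * np.sin(theta) + phases[:, None]))
    af = np.abs(af.sum(axis=0)) / len(phases)
    return 20 * np.log10(np.maximum(af, 1e-6))

freq = 28e9
spacing = 0.5 * C / freq                         # half-wavelength spacing
phases = steering_phases(64, spacing, freq, steer_deg=20.0)
sweep = np.linspace(-90, 90, 361)
pattern = array_factor_db(phases, spacing, freq, sweep)
print(f"main beam at {sweep[pattern.argmax()]:.1f} degrees")  # -> 20.0
```

In a real handset-tracking system the phases would be updated continuously – driven by channel estimates rather than a known angle – but the steering mechanism itself is exactly this change of relative phase across the array.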

However, as this is a technically complex system that has yet to be implemented on such a large scale, the technology is still in its infancy and is currently being trialled in select areas only. Considerable efforts will have to be made to ensure that such a transceiver could operate in a bustling environment where multipath and body-blocking would cause strong interference.

5G in 2019

3GPP (the 3rd Generation Partnership Project) is an organisation that was established in 1998 and helped to produce the original standards for 3G. It has since gone on to produce the specifications for 4G and LTE, and is currently working to achieve a 5G-ready system in 2020.

With certain carriers having already released 5G in parts of America this year, 2019 will welcome numerous 5G handsets from several of the flagship giants such as Samsung, LG, Huawei and even Xiaomi – a budget smartphone manufacturer.

As with previous generations, though, only limited coverage will be available at first (and at a hefty premium), so in practice it will be fairly similar to Wi-Fi hot-spotting. A lot of work is still required to overcome the issues discussed above.


Advanced Technologies in Healthcare


By: Nigel Whittle
Head of Medical & Healthcare

21st March 2019

4 minute read


Some of the biggest changes in the practice of medicine and healthcare over the past 70 years have resulted from improvements in the way diseases and illnesses can be diagnosed and studied. Innovative technologies now allow doctors to discover increasing amounts of detailed information about both the progression and treatment of disease, allowing new treatment options and care pathways.

The most significant developments which are likely to change the face of medicine over the next few decades include:

  • Enhanced self-management for patients and the elderly through technology support systems to empower understanding and control of conditions.
  • Improved patient access to health service infrastructure through utilisation of remote care and monitoring systems.
  • Further developments in medical imaging and the application of Artificial Intelligence systems to effectively analyse and diagnose conditions.
  • Precision medicine that can target medical interventions to specific sub-groups of patients based on genomic data.
  • Robotic surgical systems that can conduct exquisitely precise operations in difficult-to-reach anatomical areas without flagging or losing concentration.

Self-Management for Patients

Day-to-day physiological monitoring technology, driven particularly by the spread of consumer wearable devices with communication capabilities, can collect and integrate health information from a variety of sources, both medical and consumer-based. The next generation of wearables is likely to significantly blur the division between lifestyle accessory and medical device, as reliable non-invasive sensors for the measurement of blood pressure, blood sugar, body temperature, pulse rate, hydration level and more become increasingly implemented within these devices. The provision and integration of these complex data sets has the potential to provide valuable information, enabling a holistic approach to healthcare. The US FDA is currently working closely with industry to facilitate the introduction and effective use of these more advanced devices.

Enhanced Patient Access

In the UK, the NHS has brought high-quality medical services to every citizen, but often at the cost of long waits to see a doctor when a patient is concerned about their health. The introduction of improved access systems, including video-conferencing facilities, electronic health records and AI-powered chatbots, promises to be a powerful and game-changing move. In particular, chatbot systems such as Babylon Health or Ada can provide a highly accessible medical triage procedure, which can alleviate the pressure on over-worked doctors in GP surgeries and allow those doctors to focus on patients with more serious conditions. With increasing sophistication, these chatbots could potentially provide accurate diagnostic advice on common ailments without any human involvement. The key concern is, of course, ensuring that the algorithms operate with patient safety foremost, which requires fine-tuning to balance over-caution against under-diagnosis.

Medical Imaging and Artificial Intelligence

Following admission to a hospital, a key element of modern medicine is the use of imaging systems for clinical diagnosis, and the main challenge for doctors is to interpret the complexity and dynamic changes of these images. Currently, most interpretations are performed by human experts, which can be time-consuming, expensive and suffer from human error due to visual fatigue. Recent advances in machine learning systems have demonstrated that computers can extract richer information from images, with a corresponding increase in reliability and accuracy. Eventually, Artificial Intelligence will be able to identify and extract novel features that are not discernible to human viewers, allowing enhanced capabilities for medical intervention. This will allow doctors to re-focus on their interaction with patients, which is often cited as the most valued aspect of medical intervention.

Precision Medicine

The current paradigm for medical treatment is changing through the development of powerful new tools for genome sequencing which allows scientists to understand how genes affect human health. Medical decisions can now take account of genetic information, allowing doctors to tailor specific treatments and prevention strategies for individual patients.

In essence, precision medicine is able to classify patients into sub-populations that are likely to differ in their response to a specific treatment. Therapeutic interventions can then be concentrated on those who will benefit, sparing expense and often unpleasant side effects for those who will not.

Robotic Surgery

Currently, robotic surgical devices are simply instruments that can translate actions outside the patient to inside the patient, often working through incisions as small as 8mm. The benefits of this are clear in terms of minimally invasive surgery, and by allowing surgeons to conduct the operations in a relaxed and stress-free environment. At the moment the robot does not do anything without direct input, but with the increasing development of AI systems, it is likely that in 10 or 15 years, certain parts of an operation such as suturing may be performed automatically by a robot, albeit under close supervision.

What will new technology mean for healthcare?

It is fiendishly difficult to predict the impact of innovative technological advances on medical practice and patient care. However, the overall message is clear – improvements in front end technology will allow patients to have a greater responsibility for their own personal health and well-being. Increased access to medical practice through innovative and efficient mechanisms will allow doctors to focus their time on the patients identified as suffering from more serious illnesses. Highly trained AI systems can then complement the doctors’ prowess in identifying and diagnosing particular diseases. Finally, treatment options will be highly tailored to individual patients and their conditions, increasing the cost-effectiveness of treatment.

However, each of these technology developments comes with associated costs and challenges. Not least, new technology could fundamentally change the way that medical staff work, requiring new skills and mindsets to effectively transform medical care into a radically new approach.

For an informative chat on how Plextek can assist with your Healthcare technology project, please contact Nigel at healthcare@plextek.com



The New Science of Genetic Medicine


By: Nigel Whittle
Head of Medical & Healthcare

9th January 2019


With surprisingly little fanfare, in October 2018 NHS England became the first health service in the world to routinely offer genetic medicine in the fight to treat cancer.

From that date, hospitals across England have been linked to specialist centres that can read, analyse and interpret DNA isolated from patients with cancer. Through this service, cancer patients can be screened for the existence of key mutations within their tumours that can indicate the best drugs for treatment or to point towards clinical trials of experimental therapies that may be beneficial.

The move marks a big step toward precision medicine, which offers more effective therapies that are tailored to individual patients.

What is the science underpinning this move?

Firstly, a quick crash course in cancer biology:

  • Cells are the building blocks of every living organism. The instructions (or genes) that tell a cell how to develop and what to do are encoded in long linear molecules of DNA found in the nucleus of the cell.
  • These DNA molecules can be damaged over time or through exposure to chemicals or environmental changes. Cells become cancerous when specific changes in the DNA, called ‘driver mutations’, tell cells to grow faster and behave abnormally.
  • Many cancers form solid tumours, which are masses of tissue, while cancers of the blood, such as leukaemia, generally do not form solid tumours.
  • As these cancer cells multiply to form a tumour, selective pressure increases the number and type of harmful mutations found within the DNA.
  • The cells may acquire additional properties through mutation, such as malignancy, which means that they can spread into, or invade, nearby tissues. In addition, as these tumours grow, some cancer cells break off, travel to distant parts of the body and form new tumours far from the original site.

Accordingly, although every cell of a particular cancer is related to the same original “parent” cell, the mixture of cells within a tumour becomes increasingly complex. The idea that different kinds of cells make up one cancer is called “tumour heterogeneity”, and in practice means that every cancer is unique. So two people with, say, lung cancer who are the same age, height, weight, and ethnicity, and who have similar medical histories, will almost certainly have two very different cancers.

By the time a cancer tumour is 1cm in diameter, the millions of cells within it are very different from each other, and each cancer has its own genetic identity created by the DNA in its cells.

This, of course, makes the treatment of cancer incredibly difficult and explains why scientific breakthroughs in the understanding of cancer biology do not always lead to significant improvements in overall survival rates.

How will cancer treatment change?

Precision medicine is an approach to patient care that allows doctors to select the best treatments for patients based on a genetic understanding of their disease. The idea of precision medicine is not new, but recent advances in science and technology have allowed the ideas to be brought more fully into clinical use.

Normally, when a patient is diagnosed with cancer, he or she receives a standard treatment based on previous experience of treating that disease. But typically, different people respond to treatments differently, and until recently doctors didn’t know why. But now the understanding that the genetic changes within one person’s cancer may not occur in others with the same type of cancer has led to a better understanding of which treatments will be most effective.

At the simplest level, this understanding allows targeted therapy against cancer, in which drugs (quite often complex biological molecules) are used to target very specific genetic changes in cancer cells. For example, around 15–20% of malignant breast cancers contain cells with a higher than normal level of a protein called HER2 on their surface, which stimulates them to grow. Combined with a suitable test for HER2 status, this means that not only can an anti-HER2 drug be given to those patients most likely to benefit, but also that the drug, with its associated side effects, need not be given to patients who would not benefit from its use.

So genetic medicine has already transformed the treatment of some cancer patients. The advent of widespread genetic medicine within the NHS is likely to lead to significant benefits for cancer patients, including:

• The identification of patients who are most likely to benefit from particular cancer therapy.

• The avoidance of unnecessary treatments that are less likely to work for specific groups of patients.

• The development of novel therapies targeted at specific tumour cells or cellular pathways.

Not only will precision medicine allow the development of precise and effective treatment strategies for cancer patients whilst improving the overall quality of life, but it will also finally destroy the myth of ‘one size fits all’ cancer therapy.

For an informative chat on how Plextek can assist with your Healthcare technology project, please contact Nigel at healthcare@plextek.com




Being Your User


By: Nicholas Hill
Chief Executive Officer

19th December 2018


One of the important steps in the Design Council’s recommendations for good design is called “Being Your Users” and is a “Method to put yourself into the position of your user.” Its purpose is “building an understanding of and empathy with the users of your product …” Approaching product design from this perspective is critical to ensuring that the features incorporated are actually beneficial to the user – as opposed to features that are of benefit to the manufacturer, for example, or “because we can” features that have no obvious benefit at all.

It’s clear that domestic appliances are becoming more sophisticated, a trend which is facilitated by the availability of low-cost sensors and processing power. This has some clear benefits, such as the availability of more energy- or water-efficient wash cycles for example. And if designers stay focused on providing something of value to the end user this is a trend to be welcomed.

In practice, I see examples of what looks rather like engineers wondering what else they can do with all this additional sensor data, rather than being driven by user need. One example is the growing size of the error codes table in the back of most appliance manuals. These may occasionally add value, but for the most part, I see them as reasons why the product you paid good money for is refusing to do the job it is supposed to.

Here’s an example: the “smart” washing machine that I own doesn’t like low water pressure and has a number of error codes associated with it. What does it do if the mains pressure drops temporarily – e.g. if a toilet is flushed while the kitchen tap is running? It stops dead, displays an error code and refuses to do anything else until you power off the machine at the wall socket, forcing you to start the wash cycle again from scratch. This gets even more annoying if you’d set the timer and come back to a half-washed load. In the days before “smart” appliances, a temporary pressure drop would have either simply caused the water to fill more slowly, or else the machine would have paused until pressure returned.
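
For illustration only, here is a minimal sketch of the behaviour a user-centred specification might have asked for instead – pause and keep polling while the pressure is low, and only raise an error if the supply stays dead for a sustained period. The hardware hooks (read_flow, valve_open, valve_close) are hypothetical stand-ins, not any manufacturer’s API:

```python
import time

LOW_PRESSURE_TIMEOUT_S = 600     # give the mains 10 minutes to recover
POLL_INTERVAL_S = 5

def fill_drum(target_litres: float, read_flow, valve_open, valve_close):
    """Fill the drum, pausing rather than aborting on a temporary
    pressure drop. read_flow() returns litres/minute; valve_open()
    and valve_close() drive the inlet valve (all hypothetical)."""
    filled = 0.0
    low_since = None
    valve_open()
    try:
        while filled < target_litres:
            flow = read_flow()
            if flow > 0.1:                         # water is flowing
                filled += flow * POLL_INTERVAL_S / 60
                low_since = None
            else:                                  # pressure has dropped
                low_since = low_since or time.monotonic()
                if time.monotonic() - low_since > LOW_PRESSURE_TIMEOUT_S:
                    # the one case that genuinely merits an error code
                    raise RuntimeError("no water supply")
            time.sleep(POLL_INTERVAL_S)
    finally:
        valve_close()
```

A slow fill costs the user a few extra minutes; a hard stop costs them the whole cycle.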

In what way does this behaviour benefit the user? Clearly, it doesn’t, and a few moments’ thought from a design team focussed on user needs – “being your user” – would have resulted in a different requirement specification being handed to the engineering team. It’s a good example of what happens when you start implementing a solution without properly considering the problem you are trying to solve.

My “intelligent” dishwasher has a different but equally maddening feature: it doesn’t like soft water. Its designers have clearly put water saving above all else, and the machine relies on either hard water or very dirty plates to counteract the natural foaming of the detergent tablets. With soft water, if you try washing lightly soiled dishes on a quick wash cycle (as might seem appropriate), the machine is unable to rinse off the detergent. About 20 minutes into the cycle it skips to the end and gives up, leaving you with foamy, unrinsed plates.

I say unable, when the machine is actually unwilling, as all that is required is the application of sufficient water to rinse off the detergent – which is what I, as a user, then have to do manually. Who is working for whom here? Once again the user’s needs have not been at the top of the designer’s agenda when the requirement specification was passed to the engineering team. A truly smart device would finish the job properly, using as much water as was needed, and possibly suggest using less detergent next time.

Unless designers get a better grip, keeping the end user experience on the agenda, I fear examples of this type of machine behaviour will proliferate. We will see our devices, appliances and perhaps vehicles develop an increasingly long list of reasons why they can’t (won’t) perform the function you bought them for – because they’re having a bad hair day today, which becomes your problem to solve.

All to a refrain of “I’m sorry Dave, I’m afraid I can’t do that.”

