50 Ways to Leave Your Lover… And to Solve a Problem

By: Stephen Field
Lead Consultant, Product Design

15th November 2017

There are ‘50 Ways to Leave Your Lover’ according to singer Paul Simon in his hit song from 1975. Looking back, he only managed to recommend five, quite vague, ways of extricating yourself from a physical relationship. I suppose you’d call that false advertising, or perhaps, over-promising and under-delivering.

Of course, Paul is only tapping into his own personal experiences and knowledge. There are likely more than fifty ways to leave one’s lover and, with no pretence of a subtle transition, I move on to compare the themes of this song to the role of the engineering designer.

In terms of mechanical design, there are usually many ways to solve a design problem. In fact, you’ll be able to find parallels here that relate to any design or engineering specialism. While there are numerous solutions to a problem, no one person, no matter how clever or experienced, can have sufficient insight to see all the possible approaches. Each of us has a limitation on what training and experience we have received and what life has exposed us to. This knowledge informs how we react to design challenges; how we approach them and the courses of action we inherently pursue.

Collaboratively, we can accomplish so much more. The adage ‘two heads are better than one’ is completely true in this scenario. Working together to solve problems enables a much wider experience to fertilise the problem-solving egg. When done at the right time, this can be amplified by the use of brainstorming techniques.

Brainstorming is most effective when the team is composed of people from a diverse range of backgrounds. Each individual’s engineering experience ensures that every member brings something unique to the creative process, and we’re able to visualise many possible solutions as a result. Brainstorming helps us avoid simply opting for the first solution that comes to mind, which may not be the best. Furthermore, working in a team also helps the creative process: one idea sparks another, and so on.

A diverse team with an open dialogue approach to the creative process allows for the introduction of different ways of seeing the world and the challenges we face. These varied viewpoints help to steer the design to the best solution. The person who can see the holes in another’s plan makes just as important a contribution as the one making the suggestion.

The moral of the story: during the creative phase of the design process, share and collaborate with your colleagues. Almost certainly they will have a perspective you’ve not seen. When faced with a mental block over your design, share it and welcome different and even opposing views from a supportive team. Make a new plan, Stan.

Innovation… What Does It Mean?

By: Stewart Da’Silva
Senior Designer, Product Design

8th November 2017

The buzz word of the moment that is constantly being bandied about is ‘innovation’. There is hardly a departmental or company briefing where that word isn’t mentioned.

Indeed, it seems to be held up in the business world as the holy grail of survival; a panacea against the risk of extinction (in the corporate sense). Market gurus metaphorically stand on tip-toes whilst balancing on rooftops shouting through megaphones…”INNOVATE OR DIE!”

But what exactly does ‘innovation’ mean? What does it mean to us as individuals and as a company?

My perception of ‘innovation’ is that it isn’t something that I, personally, should bother my pretty little head about. After all, I know for certain that having spent my whole working life immersed in the world of engineering… I have never once in all those many, many years had a spark of an original idea that has ever taken seed and germinated in the wilderness that is my brain.

No, I had assumed that this call for us to innovate was directed towards the more intelligent amongst us and that they were being asked to dream up some new ground-breaking idea… a blinding flash of inspiration that our company could exploit in the form of some great new product.

Then the realisation began to dawn that there had, in fact, been very few real inventions of any substance for many years.

A case in point is in our own industry – electronics.

It is accepted that the transistor was the starting point of the phenomenal growth of the electronics industry as we know it today. The transistor was ‘invented’ at the Bell Laboratories in 1947 by John Bardeen and Walter Brattain. In fact, they, together with William Shockley, received the 1956 Nobel Prize in Physics for “their researches on semiconductors and their discovery of the transistor effect.”

Except… they didn’t ‘discover’ the transistor effect.

It was, in fact, described by one Julius Lilienfeld in a patent he filed in Canada on the ‘field effect transistor’ in 1925. Although he patented it, he published no known research articles on the subject. Bell scientists Bardeen and Brattain in fact built a field effect transistor in their research laboratory using Lilienfeld’s patent and, surprisingly, it worked. They then set about improving and refining the efficiency of the device and published their findings. Although Lilienfeld’s patent was the basis for their transistor, he was never credited in their published papers.

But then Lilienfeld himself had built upon research and observations that had gone before.

In 1833, Faraday’s research on the negative temperature coefficient of resistance of silver sulphide was the first recorded observation of any semiconductor property. The trail from Faraday’s experiments to the Lilienfeld patent had many, many contributors.

My point?

‘Nanos gigantum humeris insidentes’ – dwarfs standing on the shoulders of giants: discovering truth by building on previous discoveries.

The first working transistor wasn’t so much invented in 1947 as evolved from Faraday’s first observations in 1833. At the time, that is all it was – an observation, with no obvious applications.

This meandering pathway had then progressed towards its conclusion (the transistor) in a succession of incremental steps. Academics and scientists didn’t carry on their given research in splendid isolation from those that went before. If they found some relevance to their own research then they applied those previous observations and investigations to further their own knowledge and that of those that were to follow.

Which brings me back to where I started – ‘Innovation… what does it mean?’

In today’s engineering environment, I believe that it means that we, each and every one of us, could be an innovator. We don’t have to be qualified in a specific field. We just need to be open and have the vision to see how established techniques in the world around us could be transferred and applied to other disciplines to create or improve an existing product: cross-pollination of ideas and skills. Indeed, in the first instance, there is no need for detail… just the vision.

I believe that each and every one of us is capable of doing that.

The New Shamans?

By: Nicholas Hill
Chief Executive Officer

27th September 2017

We can view the engineer today as someone who has special powers to control or influence the spirits of technology. A guardian of the magic of electronics and software; someone who can be looked to for a vision of what can be achieved if the spirits are willing, and someone in whom we trust that the spirits will be successfully harnessed and the vision delivered (on time and to budget).

I started driving at 17 in an old Mini that was built in 1964, and, as such, was fairly basic in engineering terms. To give a flavour of how basic it was, the windscreen wipers did not self-park (they stopped dead the moment the switch was turned off). There was also no starter solenoid, so you activated the starter motor by pressing a heavy duty push switch mounted next to the handbrake.

Being both an aspiring engineer and a rally competitor, I was soon pulling the car apart to make performance improvements, in the process finding out how all the mechanical and electrical parts and systems operated. The simple engineering was a blessing at that point as it was fairly self-evident how most things worked once you had them stripped down to the core components, whether it was the engine, suspension, brakes or electrics.

Over the years since I started driving, the change in technology has been astonishing. The change seems more remarkable now, looking back over a few decades, than it did as I experienced it evolving. Layers upon layers of complexity have been added to everyday objects from the motor car to the telephone: first in electronics, then in software, and most recently in artificial intelligence.

There was a period of time when using the new technology was just too unfriendly for many people – remember all the complaints about not being able to program a video recorder? I think we reached a tipping point where user interface design had moved on far enough to make devices intuitive to use, and since then the general public can’t get enough of what technology can deliver. The trajectory now for the public at large seems to be an ever-growing hunger for the benefits of new technology, in tandem with an ever-diminishing idea of how any of that technology works.

Going back to the starter motor on my Mini, you could quite quickly explain its operation to a layperson with no specialist engineering knowledge. A fat wire connected the battery to the electric starter motor via the big switch that I referred to. That was it. There were only really four concepts to grasp – the electrochemistry in the battery, electrical conductors and switches, and enough electromagnetics to explain how a motor produces torque. Indeed, you could make a small-scale demonstrator using a torch battery and a small motor and then pull them apart to aid explanation.

This got me wondering about how I would go about explaining to a similar person how my present car gets started. At the highest level, it is easy enough: when I press the button on the dashboard, the car checks that the appropriate key fob is actually somewhere in the vehicle and then tells the engine to start; the starter motor runs for as long as it takes to get the engine going and then turns off. It soon gets harder as you get into describing a collection of sensors, radio links, processing units, communications buses, and power electronics and so on. If you want to get beyond mystery black boxes it gets much harder still.

To tackle just one element – say, the wireless key fob – in any depth would require an explanation of its physical radio link to the vehicle, the communications protocol, and the authentication methods. Dealing with any of these elements is going to require a discussion of great breadth and depth. And then, of course, there’s the issue of how all the electronic components involved actually work. I concluded that it was probably beyond me; technology had just moved on too far.
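To give a flavour of just one of those layers, the authentication step can at least be sketched in outline. The snippet below is a purely illustrative challenge-response exchange of the kind commonly used between a vehicle and its key fob; the key handling and message sizes are assumptions for the example, not any manufacturer’s actual protocol.

```python
# Illustrative sketch only: a generic challenge-response exchange between a
# vehicle and a key fob. Real immobiliser protocols differ in detail.
import hashlib
import hmac
import os

SHARED_KEY = os.urandom(16)  # secret programmed into both car and fob at pairing


def car_issue_challenge() -> bytes:
    """Car broadcasts a fresh random challenge when the start button is pressed."""
    return os.urandom(8)


def fob_respond(challenge: bytes) -> bytes:
    """Fob proves it holds the shared secret without ever transmitting it."""
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()


def car_verify(challenge: bytes, response: bytes) -> bool:
    """Car recomputes the expected response and compares in constant time."""
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)


challenge = car_issue_challenge()
assert car_verify(challenge, fob_respond(challenge))  # engine allowed to start
```

Even this toy version hints at the explanatory burden: random challenges, keyed hashes and constant-time comparison each need their own discussion before a layperson could follow what happens in the half-second after the button is pressed.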

Artificial intelligence is only going to make comprehension harder, as behaviour becomes less deterministic. Until recently, a device manufacturer could describe to a user exactly how a device will behave in a given set of circumstances, albeit the explanation might be complicated. Going forward, you might find that your next-generation robotic vacuum cleaner has behaviour that is unique to you, because the environment in which it self-learned its behaviour is unique to you. The manufacturer you are now calling for an explanation may never have seen the behaviour you describe, yet the device may actually be behaving perfectly ‘normally’.

What does this mean for the engineer whose job it is to provide the technology under the hood to companies who wish to bring new and evolving products to market, when the end user knows (and perhaps cares) less and less about how it all works? It puts the engineer in a position of considerable power and influence, and with that comes the need to use that power both wisely and ethically.

I will be exploring this topic in Part 2 of this blog.

Sensing Auditory Evoked Potentials

Protecting Against Tinnitus With Big Data

By: Thomas Rouse
Senior Consultant, Medical & Healthcare

30th August 2017

We are being continuously monitored in our daily lives: search engines track browsing habits, shops analyse purchases via loyalty cards or online accounts, and social media targets adverts based upon our friends, conversations and activities. While we may accept this as the cost of entry to the modern world, few would deny that it is evocative of a dystopian, Big Brotheresque hierarchy, where the monitoring is unlikely to be for our benefit. Health monitoring may be a nobler goal; however, even a seemingly altruistic project, DeepMind’s collaboration with the Royal Free London NHS Foundation Trust to reduce preventable deaths from acute kidney injury, has fallen foul of public perception and the Information Commissioner’s Office.

There is a lot of excitement about data-driven health innovation, especially where the data can be collected automatically, and potentially uploaded or aggregated. This could allow for improved outcomes, more accurate diagnoses, early warning of conditions, advanced recovery monitoring, fewer hospital visits, and ultimately revolutionise the understanding and treatment of many diseases and conditions.

We have developed a wonderful technology which can automatically provide a detailed characterisation of a user’s auditory system by detecting electrical signals from the cochlea and auditory brainstem. No user interaction is required beyond putting on a set of headphones, and no clinical supervision is necessary. It can detect permanent or temporary changes and, with regular use at home or work, provide early warning of the onset of hearing loss and tinnitus.
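For readers wondering how such tiny neural signals can be separated from the much larger background electrical noise, the standard technique is stimulus-locked averaging: the evoked response is time-locked to each repeated click, so it adds coherently while uncorrelated noise averages away. The sketch below illustrates that general principle only; the sample rate, epoch length and sweep count are illustrative assumptions, not a description of our implementation.

```python
# Illustrative sketch: stimulus-locked averaging, the standard way weak auditory
# evoked potentials are extracted from background EEG noise. The constants here
# (sample rate, epoch length) are hypothetical.
import numpy as np

FS = 16_000          # samples per second
EPOCH_MS = 10        # brainstem responses occur within ~10 ms of a click
EPOCH_SAMPLES = FS * EPOCH_MS // 1000


def average_response(recording: np.ndarray, stimulus_times: np.ndarray) -> np.ndarray:
    """Average the EEG epochs following each click stimulus.

    The evoked response is time-locked to the stimulus, so it adds coherently,
    while uncorrelated noise tends towards zero (SNR grows roughly as sqrt(N)).
    """
    epochs = [recording[t:t + EPOCH_SAMPLES]
              for t in stimulus_times
              if t + EPOCH_SAMPLES <= len(recording)]
    return np.mean(epochs, axis=0)


# e.g. 2,000 click presentations, one every 48 ms:
# avg = average_response(eeg, np.arange(2000) * (48 * FS // 1000))
```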

The applications are driven by who wants to use the data, and why. The technology was originally developed to allow employers to meet their health surveillance duty under the Control of Noise at Work Regulations more cost-effectively and conveniently. Workers with high levels of noise exposure need to have regular audiometric tests, and it may be disruptive or impractical to send staff to a testing centre. There are also requirements after the tests have taken place: the purpose of the regulations is to protect workers, and if an issue is detected, action should be taken to prevent further damage.

The employer must also keep health records of the outcome of the surveillance; however, these cannot contain any confidential medical records. With our technology, these tasks can be automated without any need to leave the workplace. Beyond compliance, there is also a potential upside for the employer if testing can be carried out before and after each shift: in a case of litigation relating to hearing loss, it is likely that it could be shown whether the damage occurred inside or outside of work hours.
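As an illustration of how before-and-after-shift data might be used, the sketch below compares two sets of hearing thresholds and flags any worsening beyond a chosen limit. The 10 dB criterion, data structures and values are hypothetical choices for the example, not a regulatory requirement or a description of our system.

```python
# Hypothetical sketch: flag a temporary threshold shift by comparing hearing
# thresholds measured before and after a shift. The 10 dB criterion is an
# illustrative choice only.
SHIFT_LIMIT_DB = 10.0


def threshold_shift(pre: dict[int, float], post: dict[int, float]) -> dict[int, float]:
    """Per-frequency change in threshold (positive = hearing has worsened)."""
    return {freq: post[freq] - pre[freq] for freq in pre if freq in post}


def flag_shift(pre: dict[int, float], post: dict[int, float]) -> bool:
    """True if any test frequency has worsened by more than the limit."""
    return any(delta > SHIFT_LIMIT_DB for delta in threshold_shift(pre, post).values())


pre_shift = {1000: 10.0, 2000: 15.0, 4000: 20.0}   # dB HL at each frequency (Hz)
post_shift = {1000: 12.0, 2000: 18.0, 4000: 32.0}
print(flag_shift(pre_shift, post_shift))  # True: a 12 dB shift at 4 kHz
```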

Individuals may be concerned about their own or their loved one’s hearing. Building the technology into consumer headphones was also one of the original motivations. For example, a smartphone app would be able to take regular snapshots of a user’s hearing and alert them or a parent if there is any change, long before symptoms become apparent.

Medical trials of drugs which may have a side effect of tinnitus would be able to use the technology to objectively monitor and record the state of the auditory system instead of having to rely on the subject’s subjective assessment.

Perhaps most interestingly of all, the data could allow the technology itself to improve. This double-edged information sword needs to be handled carefully. It is essential that no-one feels their data has been abused, so this needs to be balanced with the potentially significant benefits. The signal we record was previously only obtainable by an expert practitioner in a clinic, so comparative studies over time are limited. Long term data from a large number of subjects is likely to improve system performance and the wider understanding. It may shed light upon insidious conditions, such as hidden hearing loss and tinnitus, and provide a vital additional tool for audiologists as part of an integrated healthcare system.

The Freerunners of Wearables

By: Stewart Da’Silva
Senior Designer, Product Design

19th July 2017

Is the push for new technologies, such as wearables, and the immersive technology of Virtual Reality (VR), a means of us short-cutting natural evolution? Is this the start of a journey where we will enhance our natural abilities with various wearables marketed as specific desirable traits?

Rubbish! I can hear you say – but consider this.

For millennia, Mother Nature has advanced our capabilities in her own slow, haphazard way – it takes generations for what we would perceive as “needed developments” to take place within our bodies. We are impatient, and technology serves as a way of supplementing the limitations that nature has imposed upon us.

Globally, we have already travelled quite a way down that road.

Generally, we no longer walk anywhere – vehicles have become our legs. Our intellect, in various degrees of reliance, has become our smartphone or tablet. Who needs a good memory when we can depend on an internet search engine? Imagination? Why not just slip on your VR headset and become immersed within an installed experience of your choice?

For all the examples above, wearables are rapidly becoming mandatory in order for us to function. They are now becoming more personal – monitoring our pulse, our blood pressure, how active we are and other indicators of health and wellbeing.

What of the future? In some quarters, wearables have moved into semi-permanent implants, from mini defibrillators that monitor the heart for abnormal rhythms and correct them, through to implantable Radio Frequency Identifiers (RFIDs).

Kevin Warwick, Professor of Cybernetics at the University of Reading, was the first person to have an RFID implanted into his own hand. Inserted by a trained medical staff member, the implant enables him to operate card-reading doors and to control lighting. Another of his firsts was to have a device implanted into his median nerve that linked his nervous system directly to a computer, programming a robotic arm to exactly mimic his own arm movements.

Now we come to ‘freerunning’ wearables.

There are people out there that are experimenting with ‘off-grid’ wearable implants to heighten and augment sensations within and outside their own bodies. They call themselves grinders, biohackers or body hackers and many of them manufacture and insert the wearable implants themselves. The favoured wearable for biohackers is a magnet that is implanted into various parts on the body, usually on the end of a finger.

These wearable magnets can detect magnetic fields emanating from various sources. Microwave ovens or power lines, for example, cause the implanted magnets to vibrate against the adjacent nerves – giving the grinder a ‘sixth sense’. It also means that they can attract light ferrous-based objects without physically touching them… magic!

Grinders aren’t only about ‘sixth sense’ stuff – some have a personal objective to enhance or correct what nature has dealt them. Some of these wearable implants, like Warwick’s experiments, could in the near future become mainstream.

For example, in conjunction with the wearable implanted magnets, the ‘Bottlenose’ device can be slid over the ‘magnet finger’. Bottlenose, a project by open source biotechnology company Grindhouse Wetware, mimics the sonar echolocation that the dolphin of the same name uses. This device sends out electromagnetic pulses and the implanted magnets are extremely sensitive to the returning waves. Vibration intensity also increases the closer the subject is to the obstacle. With practice, a mental picture can be formed of the shape and distance of surrounding objects.
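The feedback principle is simple enough to sketch: an echo’s round-trip time gives the distance to an obstacle, and the closer the obstacle, the stronger the pulses driving the fingertip magnet. The toy example below assumes ultrasonic ranging in air and an arbitrary linear mapping; it is an illustration of the idea only, not Grindhouse Wetware’s firmware.

```python
# Toy sketch of the feedback idea described above: convert an echo's round-trip
# time into a distance, then drive the fingertip magnet harder for closer
# obstacles. The constants and mapping are illustrative assumptions.
SPEED_OF_SOUND = 343.0   # m/s, assuming ultrasonic ranging in air
MAX_RANGE_M = 3.0        # beyond this, no feedback is given


def echo_to_distance(round_trip_s: float) -> float:
    """Distance to the obstacle from the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0


def vibration_intensity(distance_m: float) -> float:
    """0.0 (nothing in range) up to 1.0 (obstacle very close)."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    return 1.0 - distance_m / MAX_RANGE_M


print(vibration_intensity(echo_to_distance(0.005)))  # ~0.71 for an object ~0.86 m away
```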

The implications are obvious: people with sight loss could ‘see’ again using a refined version of Bottlenose. However, what if the whole wearable device could be miniaturised? Perhaps it could become a mainstream wearable implant in its own right?
