Whilst most pundits are focusing on AI, VR, AR and Machine Learning, it is clear that the Digital Frontier has moved on to Human Integrated Computing (HIC). Not everyone uses the same terminology, however, so you may see Human Computer Interface, Brain Computer Integration and a number of similar terms used to describe basically the same thing.
This year has seen the IEEE's 6th International Conference on Brain Computer Interfaces (BCIs) and the Hackaday Competition for the Greatest Human Computer Integration (with a prize of US$50,000). Clear themes from both events were the avoidance of direct implants and the use of a wide variety of techniques to communicate between people and computing devices.
Many of these techniques are based upon reading brain waves or nerve impulses, and there is a strong theme of applications that help people recover from debilitating accidents or cope with disability. Recent advances in the ease of applying machine learning have been a game changer in interpreting brain-wave and nerve-impulse signals: the ability of machine learning to learn how to filter out extraneous signals and focus on what matters to the application is key.
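To make the filtering idea concrete, here is a minimal sketch of the classic first step in any such pipeline: isolating the frequency band of interest from a noisy recording. The signal, sample rate and band edges are all made-up illustrative values, not taken from any real BCI product; real systems would follow this with a learned classifier rather than a fixed filter.

```python
import numpy as np

def bandpass_fft(signal, fs, low_hz, high_hz):
    """Crude band-pass filter: zero out FFT components outside [low_hz, high_hz]."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    return np.fft.irfft(spectrum * mask, n=len(signal))

# Simulate 2 s of a 10 Hz "brain wave" buried in broadband noise, sampled at 250 Hz
fs = 250
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 10 * t)
noisy = clean + rng.normal(scale=2.0, size=t.size)

# Keep only the 8-12 Hz band where the signal of interest lives
filtered = bandpass_fft(noisy, fs, 8, 12)

# The filtered trace correlates far better with the underlying signal
corr_noisy = np.corrcoef(clean, noisy)[0, 1]
corr_filtered = np.corrcoef(clean, filtered)[0, 1]
print(f"raw: {corr_noisy:.2f}, filtered: {corr_filtered:.2f}")
```

The hard part in practice is that the "extraneous" components (muscle artefacts, eye blinks, mains hum) overlap the band of interest, which is exactly where a trained model earns its keep over a fixed filter like this one.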
Additionally, in a wider social context there is a fringe trend towards people choosing to "self adapt" with home-grown technology implants. This trend, sometimes called Trans-humanism, involves implanting anything from magnets, through chips, to single-purpose devices into the body to provide a variety of single-function applications. Lukas Zpira refers to it as "Body Hacktivism" and espouses a creed of "taking control of our destinies by continuously reinventing the self".
So what does this mean in practical day-to-day life? Well, there are a surprising number of products either ready for market or close to release, as well as any number of technical concept demonstrators. These include:
Prosthetic Limbs,
Exoskeleton Devices,
Turning Thought into Speech, e.g. Nuro's Nuos software,
Eyeball Tracking,
Remote Controlled Limbs, e.g. CTRL Labs Wrist Band,
Additional Limbs, e.g. a second pair of arms on a backpack - Keio University's Fusion
Accessing Human Memory,
Improved Physiological Measurement of things such as Blood Pressure,
Detection of Emotions.
Though, to my mind, one of the more interesting things is the research being conducted at Drexel and ISAE-SUPAERO into aircraft pilots' cognition during extreme incidents: how they deal with the sensory overload of multiple audible alarms, flashing indicators and maintaining situational awareness when an accident occurs during flight. The research monitors how pilots handle such incidents, using functional Near Infra-Red Spectroscopy (fNIRS) to quantify the brain activity response in the Anterior Prefrontal Cortex. So far the researchers have demonstrated the feasibility of the approach, and shown that in real flight the overload is higher than in a simulator and pilots make more mistakes. In future it should be possible to use this work to help optimise instrumentation design, reducing cognitive overload and the likelihood of errors.
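For the curious, fNIRS infers changes in blood oxygenation from how near-infrared light is absorbed as it passes through tissue; the standard conversion from detected light intensity to a concentration change is the modified Beer-Lambert law. A minimal sketch follows — the coefficient values are purely illustrative and are not taken from the Drexel/ISAE-SUPAERO study:

```python
import math

def concentration_change(i_baseline, i_task, epsilon, distance_cm, dpf):
    """Modified Beer-Lambert law: estimate the chromophore concentration
    change (mol/L) from a change in detected light intensity.
    epsilon      -- molar extinction coefficient, L / (mol * cm)
    distance_cm  -- source-detector separation on the scalp
    dpf          -- differential pathlength factor (corrects for scattering)
    """
    delta_od = math.log10(i_baseline / i_task)  # change in optical density
    return delta_od / (epsilon * distance_cm * dpf)

# Illustrative (made-up) numbers: a 5% drop in detected intensity during a task
delta_c = concentration_change(
    i_baseline=1.0, i_task=0.95, epsilon=1000.0, distance_cm=3.0, dpf=6.0
)
print(f"estimated concentration change: {delta_c:.2e} mol/L")
```

In a workload study, rises in oxygenated haemoglobin concentration over the Anterior Prefrontal Cortex computed this way serve as the proxy for cognitive load.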
All of this represents practicable, achievable steps along the path towards the dreams of Elon Musk and Mark Zuckerberg, who are pursuing the full embedding of computers into the brain with their Neuralink and Building 8 programmes. But, as I mentioned in a previous posting, there are immense issues to be addressed around not just technical practicality but ethics, security, psychology, potential information overload and long-term upgradeability before these goals become safe, let alone desirable.