Human-Computer Interface (HCI) Tech Digest - June 2017

Chatbot to help with depression and anxiety

Woebot, from US-based Woebot Labs, is a chatbot available on Facebook Messenger that uses cognitive behavioural therapy techniques to help people through tough times. The chatbot monitors a user’s moods based on conversation history, analyses patterns to find useful advice, is there 24/7, learns over time to interact better, and helps people feel better. The last claim is based on a Stanford study of Woebot’s effectiveness at helping young people deal with depression and anxiety: in a group of seventy 18-28-year-olds, those who used Woebot showed significantly reduced symptoms of depression compared with a control group.

System to help stroke sufferers repair brain connections

Scientists at the University of Southern California have developed a stroke rehabilitation system called REINVENT (Rehabilitation Environment using the Integration of Neuromuscular-based Virtual Enhancements for Neural Training), which combines virtual reality (VR) with an EEG-based (electroencephalogram) brain-computer interface. The system uses VR to show the patient feedback from muscle and brain sensors – if the patient activates the sites in the brain associated with moving an arm, the virtual arm in the VR world moves. Over time, the patient could train the stroke-damaged circuits to work again. The system has only been tested on healthy older adults at this stage, but the scientists plan to test it on stroke sufferers within the next six months.
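
The feedback loop at the heart of such a system can be pictured with a short sketch. The example below is a simplified, hypothetical version (the band limits, threshold and function names are assumptions, not details of REINVENT): attempted or imagined movement typically suppresses power in the mu band of the EEG over motor areas, and that drop is used to trigger movement of the virtual arm.

import numpy as np

FS = 256            # assumed EEG sampling rate (Hz)
MU_BAND = (8, 12)   # mu rhythm band; motor imagery suppresses power here

def mu_band_power(eeg_window, fs=FS, band=MU_BAND):
    """Average power in the mu band for one channel of EEG samples."""
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def update_virtual_arm(eeg_window, baseline_power, threshold=0.7):
    """Move the virtual arm when mu power drops well below the resting
    baseline, a common marker of attempted or imagined movement."""
    power = mu_band_power(eeg_window)
    attempted_movement = power < threshold * baseline_power
    return "move virtual arm" if attempted_movement else "hold still"

# toy usage: a strong 10 Hz rhythm at rest, then a suppressed one
t = np.arange(0, 1, 1.0 / FS)
baseline = mu_band_power(np.sin(2 * np.pi * 10 * t))
print(update_virtual_arm(0.3 * np.sin(2 * np.pi * 10 * t), baseline))  # -> move virtual arm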

mindBEAGLE: assessing awareness and communicating with locked-in patients

mindBEAGLE, by Guger Technologies (g.tec), Austria, is a system that attempts to help medical professionals assess the awareness and consciousness of an unresponsive patient. The technology could assist doctors working with patients who are in a coma or a vegetative state, are minimally conscious, or have locked-in syndrome. mindBEAGLE uses auditory and vibrotactile stimulation to assess a patient’s condition: auditory awareness is tested by playing sounds to the patient, while the vibrotactile tests apply touch stimuli to the body. An EEG (electroencephalogram) cap reads brain activity during the procedure, indicating patient awareness. The neural activity could also be used to communicate with the patient in a simple yes/no fashion, with the patient asked to focus on the left-hand vibration to say yes and the right-hand vibration to say no.
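
As a toy illustration of how that yes/no step might work, the sketch below compares averaged EEG responses to the two vibrotactile stimuli and picks the stronger one. It is an assumption-laden simplification: a real system such as mindBEAGLE would use a trained classifier on evoked-potential features, and the array shapes, data and function name here are invented for the example.

import numpy as np

def yes_no_from_epochs(left_epochs, right_epochs):
    """Decide yes/no by comparing EEG epochs time-locked to the
    left-hand (yes) and right-hand (no) vibrotactile stimuli.
    Each input is an array of shape (n_trials, n_samples); the
    attended stimulus normally evokes the larger response."""
    left_score = abs(left_epochs.mean())     # crude stand-in for a classifier score
    right_score = abs(right_epochs.mean())
    return "yes" if left_score > right_score else "no"

# toy data: the attended (left) side carries a stronger evoked deflection
rng = np.random.default_rng(0)
left = rng.normal(0, 1, (20, 200)) + 0.5
right = rng.normal(0, 1, (20, 200))
print(yes_no_from_epochs(left, right))   # prints "yes" for this toy data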

O6: ‘eyes-free’ smartphone control

O6 is a remote smartphone controller for the iPhone, iPad and iPod Touch that the company behind it, Fingertip Labs, is calling an ‘eyes-free’ device. The screen-less 40x11mm puck-like dial uses Bluetooth to connect to the user’s Apple device. The user twists and clicks it to scroll through menus and lists, and to listen to texts, emails, news, and music routed through Bluetooth-enabled speakers or a headset. The device also lets the user respond to messages, pull up directions, and answer or make calls. Input commands for different functions can be customised using the O6 app.
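
The kind of customisable mapping the O6 app offers can be pictured as a simple table from dial gestures to actions. The sketch below is purely illustrative – the gesture names and actions are invented here and are not Fingertip Labs’ actual commands or API.

# Hypothetical gesture-to-action table; the real O6 app defines its own set of commands.
ACTIONS = {
    "twist_right": "next item",
    "twist_left": "previous item",
    "click": "select / read item aloud",
    "double_click": "answer or end call",
    "long_press": "dictate a reply",
}

def handle_gesture(gesture: str) -> str:
    """Translate a dial gesture into the action the phone should perform."""
    return ACTIONS.get(gesture, "ignored")

print(handle_gesture("twist_right"))   # -> next item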

Computer model of nerve responses could lead to better tactile feedback

Scientists at the University of Chicago, USA, have developed a computer model that simulates the response of nerves in the hand to any pattern of touch stimulation on the skin. The scientists say that the model will allow them to build a map of how the 12,500 nerve fibres in the hand respond when a person interacts with objects – allowing them to build the results into realistic sensations in future bionic hands for amputees. The model could give engineers specific information on how the nerves respond when the hand contacts particular objects; those responses could then be reproduced by electrically stimulating the nerve through an implanted interface. The software will soon be available as a free download.
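
To make that last step concrete, the sketch below shows one highly simplified way a simulated nerve-fibre firing rate could be turned into a train of electrical stimulation pulses for an implanted interface. The function name and the even-spacing assumption are mine, not the Chicago group’s: real encoding schemes aim to reproduce precise spike timing, not just an average rate.

import numpy as np

def firing_rate_to_pulse_times(rate_hz, duration_s):
    """Convert a simulated nerve-fibre firing rate into evenly spaced
    stimulation pulse times for an implanted electrode (a simplification;
    precise spike timing is what realistic encoding schemes target)."""
    if rate_hz <= 0:
        return np.array([])
    return np.arange(0.0, duration_s, 1.0 / rate_hz)

# e.g. a fibre the model predicts fires at 40 Hz during a grasp
print(firing_rate_to_pulse_times(40, 0.1))  # pulse times over 100 ms: [0. 0.025 0.05 0.075]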

Real-time gesture input could lead to more natural interactions

Researchers at Carnegie Mellon University (CMU) have enabled a computer to understand people’s body gestures in video in real time, including the pose of individual fingers. The system was developed in CMU’s Panoptic Studio, a dome-shaped structure in which 500 cameras capture each movement from 500 different angles; this let the researchers write algorithms that can understand a moving gesture using only one camera and a personal computer. This capability could allow for more natural computer input methods, such as pointing at things. It could also enable computer vision and AI systems to better understand non-verbal communication, for example giving them the ability to better perceive what people are doing or how they are feeling. The computer code has been released and is currently being used by at least 20 commercial groups.
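
As a small illustration of what such input could look like downstream, the sketch below estimates a pointing direction from two 2-D keypoints of the kind a pose estimator outputs. The keypoint names and coordinates are invented for the example and do not come from the CMU system.

import math

def pointing_direction(wrist, index_tip):
    """Estimate where a person is pointing from two 2-D keypoints
    (wrist and index fingertip, in image coordinates). Returns the
    angle of the wrist-to-fingertip vector in degrees."""
    dx = index_tip[0] - wrist[0]
    dy = index_tip[1] - wrist[1]
    return math.degrees(math.atan2(dy, dx))

# image coordinates (y grows downwards), so a negative angle points up
print(pointing_direction((320, 240), (400, 180)))  # about -36.9 degrees: up and to the right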

Desktopography: an interface projector

Carnegie Mellon University researchers Robert Xiao, Scott Hudson and Chris Harrison have developed a device that projects a user interface onto a surface such as a table, where it can then be interacted with. The device, called Desktopography, uses an overhead projector equipped with a camera to throw an Android smartphone interface onto a surface. The projected display can be dragged to different locations on the surface, resized, minimised to icons, or rotated using multi-touch gestures. The virtual apps can be attached to real objects, so, for example, a projected calculator app could be attached to the side of a laptop, and when the laptop is moved to another location on the table the calculator moves with it.
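
The attach-to-object behaviour can be sketched in a few lines: store the app’s position as an offset relative to the tracked object, then recompute the app’s projection point whenever the object moves. This is a minimal illustration with invented class and coordinate names, not the Desktopography implementation.

class AnchoredApp:
    """Keep a projected app (e.g. a calculator) attached to a tracked
    real-world object: remember the app's offset from the object and
    re-project it whenever the object moves."""

    def __init__(self, app_pos, object_pos):
        self.offset = (app_pos[0] - object_pos[0], app_pos[1] - object_pos[1])

    def position(self, object_pos):
        return (object_pos[0] + self.offset[0], object_pos[1] + self.offset[1])

calc = AnchoredApp(app_pos=(120, 80), object_pos=(100, 80))  # calculator beside a laptop
print(calc.position((300, 200)))  # laptop moved; the calculator follows -> (320, 200)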

Optical neural probe advance for stimulation of neurons

Researchers from the IIT (Istituto Italiano di Tecnologia), Italy, and Harvard Medical School, USA, have developed an optical microprobe that can control electrical activity in the brain by projecting light onto selected areas of it. The technology has uses in optogenetics – using light beams to activate or inhibit neuronal activity. The device overcomes problems faced in controlling light propagation within brain tissue. It is made of a cone-shaped optical fibre with a 500 nm tip. By altering the angle of the incoming light, the light-emitting portion of the device can be enlarged or contracted without moving the device itself. The work was part of the MODEM project, funded by the ERC (European Research Council), which aims to develop a minimally invasive device that enables direct control and monitoring of neural function.
