Tags: Artificial Intelligence (AI) in Healthcare, Augmented Human, Augmented Human International Conference, Augmented Humanity, Augmented Reality (AR), Eye-Tracking-Driven Sonification for the Visually Impaired, Lifelog, Lifelog System for Detecting Psychological Stress, Stress, Wearable Devices
Lifelog systems enable us to measure biological information at all times with wearable devices.
Augmented Reality (AR) is an environment in which real life is enhanced by virtual elements in real time. The purpose of AR is to enhance the information we naturally receive through our five senses by superimposing constructed virtual elements that add complementary information and meaning not visible by natural means.
As interfaces progress beyond wearables toward intrinsic human augmentation, the human body has become an increasingly important focus of the Augmented Human International Conference series. Wearables already act as a new layer of functionality located on the body that leads us to rethink the convergence between technology and fashion, not just in terms of the ability to wear, but also in how devices interact with us.
Already, several options for wearable technology have emerged in the form of clothing and accessories. However, by applying sensors and other computing devices directly onto the body surface, wearables could also be designed as skin interfaces.
Here is a brief review of the wearability factors affecting wearables as clothes and accessories, in order to discuss them in the context of skin interfaces. These wearability factors are classified in terms of body aspects (location, body movements, and body characteristics) and device aspects (weight, attachment methods, accessibility, interaction, aesthetics, conductors, insulation, device care, connection, communication, and battery life). The factors are examined in the context of two example skin interfaces: a rigid board embedded into special-effects makeup and skin-mounted soft materials connected to devices.
The Augmented Human International Conference gathers scientific contributions toward augmenting human capabilities through technology for increased “well-being” and an “enjoyable human experience.” The topics include, but are not limited to: Brain-Computer Interfaces (BCI), Muscle Interfaces and Implanted Interfaces; Wearable Computing and Ubiquitous Computing; Augmented and Mixed Reality; Human Augmentation, Sensory Substitution and Fusion; Hardware and Sensors for Augmented Human Technologies; Safety, Ethics, Trust, Privacy and Security Aspects of Augmented Humanity.
Wearability Factors for Skin Interfaces 
There are two aspects of wearable skin interfaces to consider.
Body Aspect: location, body movements and body characteristics
Device Aspect: attachment methods, weight, insulation, accessibility, communication, interaction, aesthetics, conductors, device care and connection, battery life.
A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors 
This presentation describes a lifelog system that enables us to measure biological information at all times with wearable devices. The experiment used glasses that measure nasal skin temperature while simultaneously recording video. With that information, the team could identify stressful situations.
Augmented Visualization for Guiding Arm Movement in the First-Person Perspective 
The motivation behind guiding arm movement is to learn physical activities such as Tai-Chi, although the approach can also be used to learn other movements. The user wears AR glasses and sees the movement as an augmented shape of body parts. Occlusion caused by other students or objects becomes irrelevant, as there is no longer a traditional trainer showing the movements.
Physical activities can be learned in two steps:
1. We first learn the new movement roughly, i.e., the complete form of the moves.
2. Once we know the basic movements, we learn the details by correcting small deviations.
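The second step, correcting small deviations, can be sketched as comparing the learner's joint angles against a reference motion. The joint names, angles, and tolerance below are illustrative assumptions, not values from the paper.

```python
# Sketch: flag joint-angle deviations between a learner's pose and a
# reference pose (joint names and tolerance are hypothetical).

REFERENCE = {"shoulder": 45.0, "elbow": 90.0, "wrist": 10.0}  # degrees

def find_deviations(pose, reference=REFERENCE, tolerance=5.0):
    """Return joints whose angle differs from the reference by more
    than `tolerance` degrees, with the signed error for feedback."""
    return {joint: pose[joint] - ref
            for joint, ref in reference.items()
            if abs(pose[joint] - ref) > tolerance}

learner = {"shoulder": 52.0, "elbow": 91.0, "wrist": 10.0}
print(find_deviations(learner))  # only the shoulder exceeds the tolerance
```

The signed error could then drive the augmented overlay, e.g. highlighting the shoulder and indicating in which direction to correct it.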
Exploring Eye-Tracking-Driven Sonification for the Visually Impaired 
The idea is that the user decides which information is relevant. Control is achieved by tracking the user's eye movements across the field of exploration. The device can play sounds for colors, text, and facial expressions.
In color sonification, colors are mapped to instruments and pitch represents brightness.
If text appears in the eye-tracker's view, it is converted to spoken sounds. Using pitch and stereo panning, the text can be located at a 2D position.
Facial expressions are mapped to instruments, similarly to color recognition.
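The mappings described above can be sketched as two simple functions: brightness driving pitch, and horizontal position driving stereo panning. The value ranges and the MIDI-style pitch scale are assumptions for illustration, not the paper's actual parameters.

```python
# Sketch of the sonification mappings: brightness -> pitch,
# horizontal gaze position -> stereo pan. Ranges are assumptions.

def brightness_to_pitch(brightness, low=48, high=84):
    """Map brightness in [0, 1] to a MIDI-style note number:
    brighter colors produce higher pitches."""
    return round(low + brightness * (high - low))

def x_to_pan(x, width):
    """Map a horizontal pixel position to stereo pan in [-1, 1]
    (-1 = full left, +1 = full right)."""
    return 2.0 * x / width - 1.0

print(brightness_to_pitch(0.0))  # darkest -> lowest note (48)
print(brightness_to_pitch(1.0))  # brightest -> highest note (84)
print(x_to_pan(0, 640), x_to_pan(320, 640), x_to_pan(640, 640))
```

Combining both mappings lets a single sound event convey what the user is looking at (instrument and pitch) and roughly where it is (pan, plus pitch for the vertical axis).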
Stress is extremely harmful to one's health. It is important to know which situations or events cause us to feel stressed: if we know the factors behind the stress, we can take corrective action. However, it is hard to perceive stress in everyday life by ourselves. Automatically detecting stress from biological information is one way of dealing with this. Stress is generally detected using physiological indices such as pulse, brain activity, and breathing in order to ensure universality and accuracy.
This biological information reacts to sudden stressors, not chronic ones. However, it is difficult to use such measuring devices in everyday life because they require expertise to operate and are expensive. Our goal in this study is to develop a lifelog system featuring glass-equipped sensors that can be used on a daily basis. We detect stress by examining nasal skin temperature, which decreases in response to sudden stressors. To investigate the recognition accuracy of the proposed system, we performed experiments in stressful situations. Results showed that the system can distinguish factors other than stress from the change in nasal skin temperature with sufficient precision.
Moreover, we investigated the optimum locations to attach temperature sensors to ensure that they have both reactivity and comfort. We also implemented an application for analyzing the measured data. The application calculates the time at which a user feels stress by analyzing the measured data and extracts a stressful scene from a video recorded from the point of view of the user.
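The core detection idea, a sudden drop in nasal skin temperature below a resting baseline, can be sketched as follows. The baseline window and drop threshold are illustrative assumptions, not the parameters used in the paper.

```python
# Sketch: flag candidate stress intervals as drops in nasal skin
# temperature below a resting baseline (window and threshold assumed).

def detect_stress(temps, baseline_samples=3, drop_threshold=0.5):
    """Return indices of samples where temperature falls more than
    `drop_threshold` degrees C below the mean of the first samples."""
    baseline = sum(temps[:baseline_samples]) / baseline_samples
    return [i for i, t in enumerate(temps)
            if t < baseline - drop_threshold]

# 34.0 C at rest, then a sudden drop suggesting a stressor.
readings = [34.0, 34.1, 33.9, 33.8, 33.2, 33.1, 33.9]
print(detect_stress(readings))  # indices of the low-temperature samples
```

The flagged indices could then be matched against timestamps to extract the corresponding stressful scenes from the recorded video, as the application described above does.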
- Wearability Factors for Skin Interfaces. Xin Liu, Katia Vega, Pattie Maes, Joe A. Paradiso. MIT Media Lab.
- A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors. Hiroki Yasufuku, Tsutomu Terada, Masahiko Tsukamoto.
- AR-Arm: Augmented Visualization for Guiding Arm Movement in the First-Person Perspective. Ping-Hsuan Han, Kuan-Wen Chen, Chen-Hsin Hsieh, Yu-Jie Huang, Yi-Ping Hung.
- Exploring Eye-Tracking-Driven Sonification for the Visually Impaired. Michael Dietz, Maha El Garf, Ionut Damian, Elisabeth André.
Rationale to start the Journal:
The Augmented Human (AH) International Conference series started in 2010 and is attracting more and more researchers and industry practitioners.
“The total Augmented Human Market is expected to reach up to $1135 million by 2020”
Augmented Human Research Groups, Centers, and Labs around the World:
- Augmented Human Research Center (AHRC), Prof. Woontack Woo, KAIST, Korea: LINK
- Augmented Human Lab, Ass. Prof. Suranga Nanayakkara, SUTD, Singapore: LINK
- Augmented Human Trust Research Group, Ass. Prof. Jean-Marc Seigneur, University of Geneva: LINK
- Augmentation and Training of Humans with Engineering in North America (ATHENA) Lab, Prof. Stone, Iowa State University, USA: LINK