Driving simulator environment

The driving simulator environment is developed in-house. A realistic cabin is mounted on a motion platform and equipped with real pedals, a gearshift and a steering wheel with force feedback. Simulator software (SCANeR Studio, AVSimulation) is used to create realistic scenarios of various traffic situations. The virtual environment can be shown on screens or through a virtual reality headset. A surround sound system and a darkened room are used to ensure an immersive experience. The motion platform simulates the g-forces of accelerations, decelerations and cornering during driving.

In the simulator environment, various biosignals, such as EMG, ECG and galvanic skin response (GSR), can be measured. In addition, the driver’s eye movements can be tracked. Furthermore, thermal imaging can be used to detect temperature changes on the driver’s face. In a pilot study, a group of professional and non-professional elderly drivers was measured in the simulator. Differences in eye movements and in the use of the gear stick were observed between the groups.
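As an illustration of the kind of biosignal analysis involved, the sketch below estimates heart rate from an ECG trace by simple threshold-based R-peak detection. This is a minimal example, not the group’s actual pipeline: the signal is synthetic, and the sampling rate and thresholds are assumed values; real recordings would come from the simulator’s biosignal amplifiers.

```python
# Hypothetical sketch: heart rate from ECG via threshold-based R-peak
# detection. All parameters (sampling rate, threshold, refractory period)
# are illustrative assumptions, not values from the actual setup.
import math

FS = 250  # assumed sampling rate in Hz

def synthetic_ecg(duration_s=10, bpm=60):
    """Crude ECG-like signal: a narrow spike once per beat plus slow drift."""
    n = int(duration_s * FS)
    period = int(FS * 60 / bpm)
    signal = []
    for i in range(n):
        spike = 1.0 if i % period == 0 else 0.0
        drift = 0.05 * math.sin(2 * math.pi * i / FS)  # baseline wander
        signal.append(spike + drift)
    return signal

def detect_r_peaks(signal, threshold=0.5, refractory=int(0.3 * FS)):
    """Indices where the signal exceeds the threshold, with a refractory period."""
    peaks, last = [], -refractory
    for i, v in enumerate(signal):
        if v > threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

def heart_rate_bpm(peaks):
    """Mean heart rate from consecutive R-R intervals."""
    rr = [(b - a) / FS for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

peaks = detect_r_peaks(synthetic_ecg(duration_s=10, bpm=60))
print(round(heart_rate_bpm(peaks)))  # 60 for the 60 bpm synthetic signal
```

In practice, detected R-R intervals also feed heart-rate-variability measures, which are commonly used as indicators of driver stress and workload.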

The kinematics of the driver can be measured using wearable IMUs; joint angles from the ankle up to the neck and wrists can be monitored.
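To show the principle, a joint angle can be derived from the orientation quaternions of two IMUs mounted on adjacent body segments: the relative rotation between the segments gives the angle. The sketch below is an illustrative assumption about the computation, not the group’s actual processing chain; quaternions are in (w, x, y, z) order and assumed unit length.

```python
# Illustrative sketch: joint angle from two segment-orientation quaternions
# (w, x, y, z). The example rotation values below are invented for the demo.
import math

def quat_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def joint_angle_deg(q_upper, q_lower):
    """Rotation angle of the relative orientation between the two segments."""
    rel = quat_mul(quat_conj(q_upper), q_lower)
    w = max(-1.0, min(1.0, rel[0]))
    return math.degrees(2.0 * math.acos(abs(w)))

# Example: lower segment rotated 90 degrees about x relative to the upper one.
q_upper = (1.0, 0.0, 0.0, 0.0)                                   # identity
q_lower = (math.cos(math.pi / 4), math.sin(math.pi / 4), 0.0, 0.0)
print(round(joint_angle_deg(q_upper, q_lower)))  # 90
```

A full-body pipeline repeats this per joint along the kinematic chain and handles sensor-to-segment calibration, which is omitted here.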

Two different eye tracking systems are available: a wearable system (Arrington Research) and a remote two-camera system (Smart Eye XO). The Smart Eye system enables tracking of gaze on objects in the simulation environment.
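Conceptually, mapping gaze onto scene objects amounts to casting the gaze ray from the eye position and selecting the first object it hits. The sketch below shows this with spheres as stand-in bounding volumes; the scene objects and geometry are invented for the example, and a real setup would query the simulator’s scene model through the Smart Eye and SCANeR interfaces instead.

```python
# Hedged sketch: which scene object does the gaze ray hit first?
# Object names, positions and radii below are hypothetical.
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along a unit-length ray to the first sphere hit, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0]*ox + direction[1]*oy + direction[2]*oz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4.0*c  # quadratic coefficient a == 1 for a unit direction
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def gazed_object(eye, gaze_dir, objects):
    """Name of the nearest object intersected by the gaze ray, or None."""
    hits = []
    for name, center, radius in objects:
        t = ray_sphere_t(eye, gaze_dir, center, radius)
        if t is not None:
            hits.append((t, name))
    return min(hits)[1] if hits else None

# Hypothetical scene: two bounding spheres in front of the driver.
scene = [("lead_car", (0.0, 0.0, 20.0), 1.5),
         ("road_sign", (5.0, 2.0, 30.0), 1.0)]
print(gazed_object((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene))  # lead_car
```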

By combining these measurements with the group’s experience in signal analysis, the driver’s physiological state can be monitored.

Driver's kinematics measurement in the simulator using wearable sensors
Driver’s kinematics measured in the simulator using wearable IMUs. A musculoskeletal model of the driver is created in the OpenSim environment.
Data collected in the driving simulator. Upper row, 2nd image: eye tracking in the environment (green dot). Upper row, 3rd image: thermal imaging of the driver’s face. Bottom left: car parameters. Bottom right: GSR, ECG and EMG signals.