VizMove PRISM Enhanced Simulation Training Rooms

PRISM is an enhanced, immersive simulation theater that integrates wrap-around visualizations, surround sound, controllable lighting, and content interaction. It is available in single, dual, triple, or quad configurations. Immerse participants in realistic, specific scenarios to improve preparedness without exposure to hazards. Add physiological measures (ECG, RSP, EDA, EEG, fNIRS, etc.) and synchronize them with events in the training to assess each individual’s response. Learn more about PRISM projection VR systems.

VizMove PRISM Immersive Training Environments

PRISM is a fully integrated projection VR solution deployed at the push of a button. Bring contextual training to the safety of your learning space. Standardize training with realistic scenarios, distractions, and pressures. Add physiological measures (ECG, RSP, EDA, EEG, fNIRS, etc.) and synchronize them with events in the training to assess each individual’s response. Learn more about PRISM projection VR systems.

VR | Biofeedback and VR

This script presents a biofeedback example showing how physiology data from AcqKnowledge can change the Vizard display. By default it is driven by respiration (though any signal type could be applied): as the participant inhales, a grid of tiles turns various shades of blue, and during exhalation the tiles turn various shades of red. The intensity and transparency of each tile are influenced both by the incoming physio data and by a random generator. Additionally, an object (currently a cartoon balloon fish) moves up and down with the incoming data, and different 2D images can be displayed depending on signal state. The Script 058 download contains setup instructions, a sample data file, and the BIOPAC Basic Script.
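
A minimal sketch of the approach described above, not the Script 058 code: it assumes the respiration signal is delivered to the Vizard script as plain-text floats over UDP, and the port, the color/height scaling, and the balloon-fish model file name are all placeholders.

```python
# Sketch only -- not the Script 058 code. Assumes respiration values arrive as
# plain-text floats over UDP on port 5005; port, scaling, and the model file
# name are placeholders.
import random
import socket

import viz
import vizact

UDP_PORT = 5005   # placeholder port for the incoming physio stream
GRID = 4          # 4 x 4 grid of tiles

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', UDP_PORT))
sock.setblocking(False)

viz.go()

# Build the tile grid in front of the viewer.
tiles = []
for row in range(GRID):
    for col in range(GRID):
        quad = viz.addTexQuad()
        quad.setPosition(col - GRID / 2.0, row + 1.0, 6)
        tiles.append(quad)

fish = viz.addChild('balloonfish.osgb')   # placeholder model file
fish.setPosition(0, 1.5, 4)

last_value = 0.0

def update():
    """Poll the socket, then map the newest sample onto tile color and fish height."""
    global last_value
    value = last_value
    try:
        while True:                          # drain the socket, keep the newest sample
            data, _ = sock.recvfrom(1024)
            value = float(data.decode())
    except (BlockingIOError, ValueError):
        pass
    rising = value > last_value              # crude inspiration/expiration test
    last_value = value
    for quad in tiles:
        shade = random.uniform(0.4, 1.0)     # random generator varies each shade
        if rising:
            quad.color(0, 0, shade)          # blues while the signal rises (inhale)
        else:
            quad.color(shade, 0, 0)          # reds while it falls (exhale)
        quad.alpha(0.3 + 0.7 * min(abs(value), 1.0))
    fish.setPosition(0, 1.5 + value, 4)      # object height follows the signal

vizact.ontimer(0.05, update)                 # update roughly 20 times per second
```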

VR | Feedback example: Dynamometry (MVC trace)

In this BIOPAC Basic Scripting example, the code allows the following experiment to be performed: the participant’s Maximum Voluntary Contraction (MVC) is recorded; the participant then exerts force so that the force signal approximates a target path; the target path is specified ahead of time by the researcher via a text file and is displayed in full. A text file containing the force data and the target path data is produced so it can be loaded and analyzed in AcqKnowledge. The Script 060 download contains setup instructions, a sample data file, and the BIOPAC Basic Script. Find additional samples at VR Resources.
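
A minimal sketch of the data-handling side of this experiment (the real Script 060 reads live force data through BIOPAC hardware): the file names, the tab-delimited layout, the MVC value, and the simulated force samples are all placeholder assumptions.

```python
# Sketch only -- file names, layout, MVC value, and simulated samples are placeholders.
import csv
import random

MVC = 250.0   # placeholder maximum voluntary contraction, in newtons

# Target path prepared ahead of time by the researcher: one value per line,
# expressed here as a fraction of MVC.
with open('target_path.txt') as f:
    target = [float(line) for line in f if line.strip()]

# Simulated force trace; in the real setup these samples come from the dynamometer.
force = [t * MVC + random.gauss(0, 5) for t in target]

# Write force and target side by side so the file can be loaded into AcqKnowledge.
with open('mvc_session.txt', 'w', newline='') as out:
    writer = csv.writer(out, delimiter='\t')
    writer.writerow(['force_N', 'target_N'])
    for f_sample, t_fraction in zip(force, target):
        writer.writerow([f'{f_sample:.3f}', f'{t_fraction * MVC:.3f}'])
```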

VR | File Format for fNIRS Data Recorded with COBI Modern

This provides a quick overview of the data format generated by COBI Modern software and explains the four file types stored in Documents > COBI Studio Modern > Data > project: OXY (deoxygenation and oxygenation for every channel), NIR (raw intensities for the different wavelengths), MARKER (timing and labels), and LOG. All files are CSV or plain text and can be easily opened and edited. To receive the data used, complete this request for sample fNIRS data recorded with COBI Modern. For additional info, see VR Resources.
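
A minimal sketch for inspecting these exports with standard Python: the assumptions that each file is comma-delimited with a header row and that the file names contain the strings OXY, NIR, MARKER, or LOG are placeholders, so check your own exports before relying on the column names.

```python
# Sketch only -- the folder layout matches the path above, but the file-name
# patterns and the comma-delimited-with-header assumption are placeholders.
import csv
from pathlib import Path

project = Path.home() / 'Documents' / 'COBI Studio Modern' / 'Data' / 'project'

for kind in ('OXY', 'NIR', 'MARKER', 'LOG'):
    for path in sorted(project.glob(f'*{kind}*')):
        with open(path, newline='') as f:
            rows = list(csv.reader(f))
        if not rows:
            continue
        header, data = rows[0], rows[1:]
        print(f'{path.name}: {len(data)} data rows, columns: {header}')
```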

VR | HMDs with Sensors for fNIRS and B-Alert

Head-mounted displays (HMDs) are often utilized in virtual reality protocols. Measures such as fNIRS, EEG, Cognitive State, and Workload are also useful, and sensors for such measurements must be placed on the participant’s head along with the HMD. Maintaining participant comfort as well as field of view is critical for such studies to provide meaningful results. The comfort and fit of an HMD worn with additional sensors were tested: neither fNIRS nor B-Alert sensors blocked the field of view of an individual wearing a VR headset. Learn more about using head sensors with HMDs and see VR Resources.

VR | How VR Can Improve Market Research

Alex Dimov, BIOPAC Sales Executive, was interviewed by Inside Marketing at the Neuromarketing World Forum to discuss how virtual reality can improve marketing research. Alex explains how VR can be used to present virtual iterations of a product to save time and money, and notes that measuring physiological responses adds valuable information for product and messaging refinement. (The interview is in English.) See VR Resources for sample applications and download 10 Best Practices for Measuring Emotions.

VR | Multi-signal, Multi-subject Streaming in Vizard

This demonstrates how to integrate BIOPAC data into the virtual world. The example shows how to stream fNIR data and physiological data (heart rate) from four participants using tools in COBI imaging software, AcqKnowledge research software, and Vizard VR software. Find additional samples at VR Resources.
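
A minimal sketch of one way to keep the latest sample from several participants in a Python 3 script (the actual example uses COBI imaging software, AcqKnowledge, and Vizard tooling for the streaming itself): the participant-to-port mapping and the plain-text UDP message format are placeholder assumptions.

```python
# Sketch only -- the participant-to-port map and the plain-text UDP format are
# placeholders; the streaming itself is handled by COBI/AcqKnowledge tools in
# the actual example.
import selectors
import socket

PORTS = {1: 6001, 2: 6002, 3: 6003, 4: 6004}   # placeholder participant -> port map
latest = {pid: None for pid in PORTS}          # newest sample per participant

sel = selectors.DefaultSelector()
for pid, port in PORTS.items():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', port))
    sock.setblocking(False)
    sel.register(sock, selectors.EVENT_READ, data=pid)

def poll(timeout=0.0):
    """Drain whatever has arrived and update the per-participant values."""
    for key, _ in sel.select(timeout):
        try:
            payload, _ = key.fileobj.recvfrom(1024)
            latest[key.data] = float(payload.decode())
        except (BlockingIOError, ValueError):
            pass

# In a Vizard scene this would run inside a vizact.ontimer callback so the
# virtual world can react to all four streams in real time.
while True:
    poll(timeout=0.1)
    print(latest)
```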

VR | Sending Markers from Vizard 7 and Python 3 to COBI Modern

This provides a quick overview of how to send markers from Vizard 7 and Python 3 to COBI Modern. Both applications must be on the same network (they can run on the same computer or on two separate computers). It covers the initial setup instructions that are required, the “send markers” command, and how to add markers to a live or virtual (playback) recording. To receive the Python code used for this task, complete this request for PY code. For additional info, see VR Resources. If running older versions, see Vizard 6/Python 2 to COBI.
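
A generic sketch only: the host, port, and message format that COBI Modern actually expects come with the Python code available through the request above, so every value below is a placeholder illustrating how a Vizard 7 / Python 3 script can push a text marker over the network.

```python
# Sketch only -- host, port, and message format are placeholders; use the
# values and protocol from the Python code supplied with this example.
import socket

COBI_HOST = '127.0.0.1'   # same machine, or the IP of the COBI Modern computer
COBI_PORT = 5000          # placeholder port

def send_marker(label):
    """Open a TCP connection and send one marker label as UTF-8 text."""
    with socket.create_connection((COBI_HOST, COBI_PORT), timeout=1.0) as sock:
        sock.sendall(label.encode('utf-8') + b'\n')

# Example: mark a stimulus onset from the Vizard 7 / Python 3 script.
send_marker('stimulus_onset')
```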

VR | Unity VR Interface for AcqKnowledge

Immerse your users and capture biofeedback for analysis! Use Unity® Interface for AcqKnowledge® (UNITY-INTERFACE) to easily connect Unity3D projects with BIOPAC acquisition hardware and analysis software. Create virtual environments using industry-standard Unity; connect and configure projects with AcqKnowledge in real time; control acquisition from Unity with Custom Markers, Digital, and Analog I/O; and deploy to your devices. Find additional samples at VR Resources.