Database Open Access
MICRO Motion capture data from groups of participants standing still to auditory stimuli (2015)
Victor Gonzalez, Agata Zelechowska, Alexander Refsum Jensenius
Published: Dec. 8, 2020. Version: 1.0
When using this resource, please cite:
Gonzalez, V., Zelechowska, A., & Jensenius, A. R. (2020). MICRO Motion capture data from groups of participants standing still to auditory stimuli (2015) (version 1.0). PhysioNet. https://doi.org/10.13026/dfv0-sb95.
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.
Abstract
This project contains head movement data recorded from groups of participants who were asked to stand as still as possible while being presented with a series of auditory stimuli. Position data (in mm) were collected with a Qualisys motion capture system at 100 Hz at the University of Oslo on March 12, 2015, from a total of 108 participants. Code to read and process these files has been made publicly available.
Background
This study was conducted as a continuation of a previous project that aimed to investigate the effects of music stimuli on movement when participants try to remain at rest [1]. As part of this larger MICRO project, we collected optical motion capture data from groups of people instructed to stand as still as possible, with and without music stimuli.
Unlike in the 2012 data collection [1], participants were asked to stand as still as possible for 6 minutes, starting with 60 seconds of silence, followed by alternating 60-second segments of music and silence. All trials started and ended with 60 seconds of silence. The automatic randomization of stimuli allowed for consecutive segments of silence and music. A detailed list of the stimuli presented to each group of participants is included in this dataset. By alternating the presentation of stimuli and introducing silence segments between music segments, we aimed to validate the differences between music and silence conditions found in the 2012 dataset [1].
Methods
We recruited 108 participants during the Open Day at the University of Oslo, after approval by the Norwegian Center for Research Data (NSD), project identification number NSD2457. Participants were asked to stand as still as possible for 6 minutes, starting with 60 seconds of silence, followed by alternating 60-second segments of music and silence. All trials ended with 60 seconds of silence. Participants were aware that music could start after one minute and were free to choose their standing posture. The distribution of participants in the recording space was standardized across trials with marks on the floor indicating the approximate foot positions.
The instantaneous position of a reflective marker placed on the head of each participant was recorded using a Qualisys infrared motion capture system (13 Oqus 300/500 cameras) running at 100 Hz. The data were recorded in 12 groups of 3 to 12 participants at a time. The motion capture system was started and stopped automatically by the stimulus playback system, so all recordings are exactly 6 minutes long.
Experimental Stimuli
The 60-second stimuli used in the experiment were: A = Silence; B = Meditation; C = Salsa; D = Electronic Dance Music (EDM).
Presentation Order
The audio stimuli were presented to the groups in the orders listed below; the sketch after the list shows how these order codes map onto each recording.
01: ACABDA
02: AABDCA
03: AABCDA
04: ABCDAA
05: AABCDA
06: ACABDA
07: ACBDAA
08: ACBDAA
09: ACABDA
10: ACDBAA
11: AABDCA
12: ACABDA
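Because every trial is exactly 6 minutes at 100 Hz, each letter in a group's presentation-order string corresponds to a fixed 6,000-sample window. The following minimal Python sketch (the names STIMULI and label_segments are our own, not part of the released code) maps an order string onto labeled sample ranges:

```python
# Minimal sketch (names are illustrative, not from the released MICRO code):
# map a group's presentation-order string to labeled 60-second sample ranges,
# using the 100 Hz sampling rate and 6 x 60 s trial structure described above.

STIMULI = {"A": "Silence", "B": "Meditation", "C": "Salsa", "D": "EDM"}
FS = 100          # sampling rate (Hz)
SEGMENT_S = 60    # segment duration (s)

def label_segments(order):
    """Return (start_sample, end_sample, stimulus_name) per 60 s segment."""
    n = FS * SEGMENT_S  # 6,000 samples per segment
    return [(i * n, (i + 1) * n, STIMULI[c]) for i, c in enumerate(order)]

for start, end, name in label_segments("ACABDA"):  # order for group 01
    print(f"samples {start:5d}-{end:5d}: {name}")
```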
Questionnaire
In a post-experiment questionnaire, participants were asked to self-report demographics and details such as whether they were standing with their eyes open or closed, and whether they had their knees locked.
Data Description
The following data types are provided:
- Motion (head marker position): recorded with Qualisys Track Manager and saved as .tsv files. Data from each group of participants is saved in a separate file.
- Stimuli: audio .wav files for each of the stimuli described above.
- Demographics: descriptive data collected from participants in a post-experiment questionnaire, including age, sex, music listening habits, knee posture, and whether eyes were closed or open during the experiment. A value of 0 indicates that the participant answered "no" to the question; a value of 1 indicates "yes"; and a value of 0.5 indicates that the participant reported changing between states. For example, 0.5 in the "Locked knees?" column means that the participant reported switching between unlocked and locked knees. A minimal loading sketch follows below.
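As a starting point, the sketch below shows one way to load the two main file types in Python, assuming pandas is installed. The file name group_01.tsv and the number of metadata header rows in the QTM export are assumptions; inspect the downloaded files to confirm both.

```python
# Hypothetical loading sketch, assuming pandas. The .tsv file name and the
# number of metadata header rows in the QTM export are assumptions; check
# the actual files before use.

import pandas as pd

# Head-marker positions (mm): Qualisys Track Manager .tsv exports start with
# several metadata lines before the tab-separated numeric data begins.
motion = pd.read_csv("group_01.tsv", sep="\t", skiprows=11)  # header length assumed

# Self-reported demographics, coded 0 = "no", 1 = "yes", 0.5 = changed states.
report = pd.read_csv("self_report_nm15.csv")
print(report["Locked knees?"].value_counts())  # column name from the description above
```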
Usage Notes
Python and MATLAB code for processing the data, as well as the Max/MSP patch used to play and synchronize stimuli with the motion capture system, are available on GitHub [2].
We encourage others to validate our work and build on it by applying novel analytical methods, in particular to between-group differences and interpersonal synchronization within groups. Alternative approaches include clustering, music information retrieval, and frequency analysis of both movement and sound data. One illustrative starting point is sketched below.
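As one simple illustration (this is not the published analysis pipeline from [2]), the following sketch compares movement across segments using the per-segment standard deviation of one participant's head-marker position; the random `positions` array is a placeholder for real data:

```python
# Illustrative sketch only, not the published MICRO analysis: compare sway
# between 60 s segments via the standard deviation of head-marker position.
# `positions` stands in for one participant's (36000, 3) array of x/y/z
# coordinates in mm (6 minutes at 100 Hz).

import numpy as np

FS, SEGMENT_S = 100, 60
rng = np.random.default_rng(0)
positions = rng.normal(size=(FS * SEGMENT_S * 6, 3))  # placeholder for real data

n = FS * SEGMENT_S
for i, code in enumerate("ACABDA"):  # presentation order for group 01
    seg = positions[i * n:(i + 1) * n]
    sway = np.linalg.norm(seg.std(axis=0))  # combined per-axis SD, in mm
    print(f"segment {i + 1} ({code}): sway = {sway:.2f} mm")
```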
Acknowledgements
This work was partially supported by the Research Council of Norway through its Centres of Excellence scheme (project numbers 262762 and 250698).
Conflicts of Interest
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
References
1. Gonzalez, V., Zelechowska, A., & Jensenius, A. R. (2020). MICRO Motion capture data from groups of participants standing still to auditory stimuli (2012) (version 1.0). PhysioNet. https://doi.org/10.13026/4h9d-gf10
2. Code for processing and analysing data for the MICRO project. https://github.com/fourMs/MICRO
Access
Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.
License (for files):
Creative Commons Attribution 4.0 International Public License
Discovery
DOI (version 1.0):
https://doi.org/10.13026/dfv0-sb95
DOI (latest version):
https://doi.org/10.13026/gdxk-7395
Topics:
sound
music cognition
movement
Project Website:
https://www.uio.no/ritmo/english/projects/micro/
Files
Total uncompressed size: 175.2 MB.
Access the files
- Download the ZIP file (64.2 MB)
- Access the files using the Google Cloud Storage Browser (login with a Google account is required).
- Access the data using the Google Cloud command line tools (please refer to the gsutil documentation for guidance):
  gsutil -m -u YOUR_PROJECT_ID cp -r gs://music-motion-2015-1.0.physionet.org DESTINATION
- Download the files using your terminal:
  wget -r -N -c -np https://physionet.org/files/music-motion-2015/1.0/
- Download the files using AWS command line tools:
  aws s3 sync --no-sign-request s3://physionet-open/music-motion-2015/1.0/ DESTINATION
| Name | Size | Modified |
|---|---|---|
| questionnaire_nm15.pdf | 28.7 KB | 2020-10-08 |
| self_report_nm15.csv | 3.5 KB | 2020-12-07 |