Data structure
The Musical Gestures Toolbox uses only one struct data structure. This structure contains three fields: video, audio, and mocap, which correspond to these three types of data, respectively. An MGT data structure is created with the function:

```matlab
mg = mginitstruct;
```

This produces a structure mg with three fields. The video field (mg.video) contains data and general parameters of the video file:
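As a minimal sketch (assuming the toolbox is on the MATLAB path), the three top-level fields can be inspected directly:

```matlab
mg = mginitstruct;      % create an empty MGT structure
disp(fieldnames(mg));   % should list the three fields: video, audio, mocap
```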
- .obj: stores each frame of the video
- .gram: after running the function mgmotion, motiongram data are written to .gram.x and .gram.y
- .qom: quantity of motion of each frame
- .com: centroid of motion of each frame
- .nframe: the number of frames in the video file
- .framerate: the framerate of the video, most often 25 or 30 frames per second
- .duration: the length of the video
- .method: the method used to estimate the motion; the toolbox supports two general methods, frame differencing ('Diff') and optical flow ('OpticalFlow').
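For illustration, a hedged sketch of inspecting these fields after motion analysis; the field paths are taken from the list above, while the exact call and the plotting choices are assumptions:

```matlab
% Sketch: run motion analysis and inspect the resulting video fields.
% Assumes mg is an MGT structure with a video already loaded into it.
mgOut = mgmotion(mg, 'Diff');   % frame differencing ('Diff')
plot(mgOut.video.qom);          % quantity of motion per frame
figure;
imagesc(mgOut.video.gram.x);    % horizontal motiongram
```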
The data structures for mocap and audio are copied from the MoCap and MIR toolboxes.
One useful aspect of using structures is that information about file names and types is also stored inside the structure. This makes it possible to pass this information on in different ways, for example:
```matlab
mgOut = mgmotion(mg);
outputFileType = mgOut.output.type;      % will show 'motion'
outputFileAddress = mgOut.output.motion.filename;

mgOut = mgmotion(mg, 'OpticalFlow');
outputFileType = mgOut.output.type;      % will show 'opticalFlow'
outputFileAddress = mgOut.output.opticalFlow.filename;
```
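Building on the example above, a sketch of how this metadata could be used to select the right filename field regardless of which method was run; the case labels are taken from the comments above and should be treated as an assumption about the exact strings:

```matlab
% Sketch: branch on the recorded output type to find the result file.
mgOut = mgmotion(mg);                       % defaults to frame differencing
switch mgOut.output.type
    case 'motion'
        fn = mgOut.output.motion.filename;
    case 'opticalFlow'
        fn = mgOut.output.opticalFlow.filename;
end
fprintf('Result written to %s\n', fn);
```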
A project from the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, Department of Musicology, University of Oslo.