02.11. Wrote my report for 2014-15 based on my diary and other info I had. Have to add more things from my diary that I left out.
Had a very good meeting with Øyvind where we went through the whole process. He asked whether there were any technical difficulties, meaning whether the program was crashing (a computer freeze or a program freeze). I got a bit confused in my Norwegian and said that some things were unstable. That led us into a discussion about the difference between technical stability and programming stability.
After the meeting, the following things need to be done:
Find a way to save recorded data from the mubu.hhmm gesture recognition program.
Fix the confusion between Volume, Pan, and Effect when making a hand gesture.
A possible solution is to make a gate that is open when I make a specific gesture, like closing my hand (all fingers closed in), meaning the gesture recognition program is open. When one finger is extended it knows that it is …
Make recognition of more than three gestures available.
Have to find a better solution for the conductor to be sure of what is happening when he wants to change volume, pan, or effects. Also, is it necessary for him to do it for each track, or just the master track?
Have to make a gate that switches between the conductor adding to the overall volume of the selected track, adding to the overall volume of all tracks, and taking total control as if he were doing a solo conducting performance.
Do something about the exponential function of the volume: as it is now, it starts with very small volume changes and then big changes at the end. Might be fixed with an inverted exponential curve (suggestion ØB).
√ Calibration of the fingers for the ConGlove has to be made simpler, so I only need to press one button instead of five for each finger.
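The gate idea in the list above could be sketched as a small state machine outside Max. This is a hypothetical Python sketch, not the actual Max patch; the gesture labels are placeholders of my own:

```python
# Hypothetical sketch of the gesture gate: recognition results are
# ignored until an "arming" gesture (closed hand) opens the gate.
class GestureGate:
    ARM = "hand_closed"  # placeholder label for the closed-hand gesture

    def __init__(self):
        self.open = False

    def feed(self, gesture):
        """Return the gesture if the gate is open, else None."""
        if gesture == self.ARM:
            self.open = True      # closing the hand arms the gate
            return None
        if self.open:
            self.open = False     # one recognized gesture per arming
            return gesture
        return None

gate = GestureGate()
print(gate.feed("one_finger"))   # ignored: gate not armed yet
print(gate.feed("hand_closed"))  # arms the gate
print(gate.feed("one_finger"))   # now passed through
```

Whether one arming gesture should pass a single gesture or stay open until explicitly closed is exactly the open question in the note above; the sketch picks the first reading.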
Made version connote 1.6 E1.
Changed the volume from the max gain object to the live.gain object. Remember to scale all volumes; still a problem with the exponential curve.
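The curve problem can be illustrated in code: an exponential fader gives almost no change near the bottom and big jumps near the top, and an inverted curve (ØB's suggestion) mirrors that shape. A rough Python illustration; the exponent value k is my assumption, not taken from the patch:

```python
import math

def exp_curve(x, k=4.0):
    """Exponential fader 0..1: tiny changes near 0, big changes near 1."""
    return (math.exp(k * x) - 1.0) / (math.exp(k) - 1.0)

def inverted_curve(x, k=4.0):
    """Inverted curve: mirror of exp_curve, compressing the top end."""
    return 1.0 - exp_curve(1.0 - x, k)

for x in (0.25, 0.5, 0.75):
    print(f"{x:.2f}: exp={exp_curve(x):.3f}  inverted={inverted_curve(x):.3f}")
```

At the halfway point the exponential curve has only reached about 0.12, while the inverted curve has already reached about 0.88, which matches the complaint in the note above.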
03.11. Sent an e-mail to Jules Francoise with questions about how to save recorded gestures and how to calculate the tempo of conducting gestures.
04.11.
Dear Hilmar,
Thank you for contacting me and for the interest!
I am replying from my current professional email address.
Great news to see that you are applying mubu.hhmm for recognizing conductors’ gestures!
My first question is, therefore: Is there a way to SAVE the recorded gestures for use “next” time when the program is loaded?
You have two ways to save and load: you can either save the gestures in mubu or save the recognition models:
- You record all gestures in different buffers of the mubu container. You can then use the mubu write/read functions to communicate with files. For example, use writeall to save the whole container (all gestures) to a mubu file. You can reload them with readall, but you will then need to send the message train after reading the data in order to train the mubu.hhmm object so that it learns from this training data.
- The other option is to use the file I/O functions in mubu.hhmm. Once a model is trained, you can save it using the write message, and reload it later using read.
Of course, you can also combine both methods (just to be sure you keep the original gestures, or in case you want to play with the parameters).
My second question is, therefore: Is there a way to measure the average tempo of the conducting gesture?
My first idea for this is to use the time progression estimated by the object. If you look at the example hmm_following, you will see that the object outputs an estimation of the time progression at each time step (the normalized time index within the reference gesture). You should be able to use this to derive an estimation of the tempo (relatively to the reference tempo).
Best,
Jules
05.11. Home office day. Read Jules' email and tried to figure out his suggestions about saving and measuring the average tempo.
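Jules' tempo suggestion can be paraphrased in code: if the follower reports a normalized time index (0..1) into the reference gesture at each frame, the slope of that index over real time gives the speed relative to the reference. A hypothetical sketch; the function name, frame interval, and the assumption of a steady tempo are mine:

```python
def tempo_ratio(progress, dt):
    """Estimate tempo relative to the reference gesture.

    progress: normalized time indices (0..1) reported by the follower,
              one per analysis frame.
    dt: seconds per analysis frame.
    The slope of progress over real time is the fraction of the
    reference gesture covered per second; multiplied by the reference
    duration in seconds, it gives the tempo ratio.
    """
    if len(progress) < 2:
        raise ValueError("need at least two frames")
    elapsed = dt * (len(progress) - 1)
    return (progress[-1] - progress[0]) / elapsed

# A gesture performed in 0.5 s covers the whole reference (0 to 1):
frames = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
print(tempo_ratio(frames, dt=0.1))  # 2.0, i.e. twice as fast as a 1 s reference
```

A real conducting gesture would need smoothing (the estimated index jitters), but the averaging over the whole frame list already does a crude version of that.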
06.11. Back at the office working on the saving functionality of ConDiS.
Found out how to do the saving and finished a nice setup for it. Needs to be intertwined with the bpatcher object.
Done with the writeall and saveall messages. Found out that I need to do the following:
09.11. Started by cleaning up the readall and writeall.
Worked on a way to add and subtract overall volume. Have to find a way to start and stop the volume change. One way is to have finger signs for start and stop, and then an up-and-down gesture to adjust the volume. Made a convolume patch to work on the volume-change programming.
10.11. Started by making a window-close message for the Hadron window. A useful thing to avoid having too many windows open during the performance.
Put a slide object in Volume; not sure it works as I thought it would. The volume slider still seems to be stuttering.
Did a lot of cleaning up, especially around the finger selection; made a bpatcher and it looks cleaner. Still no solution to the volume control thing. Don't know what to do… think, think, think.
11.11. Skrivekurs (writing course) at Dragvoll. Very useful and promising; I need something to get me going in the writing process. Have to do a short essay for January, where I will probably work on the role of the conductor, his gestures, and perhaps some comparison between one style of conducting and another. Or I might stick with my idea of writing about the conductor as a social figure, the Maestro.
Had a terrible headache and went home to bed after the writing course, around 14:00.
12.11. Back to work around 11:00 after a bunch of painkillers to get rid of the headache. Not feeling too good, but I have to work since I have to finish my finances from the last traveling period, i.e. Kristiansand and Aberdeen. How happy I am not having to travel for the next couple of months. Started my work by thinking about the gesture selection and passing problem. My main task is to find out how I can tell the computer to react to one gesture by opening that gate and staying like that until another gesture tells it to close and open another gate (volume, pan, effect). The problem is that I do get the information when, for instance, making the number 1 gesture (up/down), but after that I still receive the other gesture signs, and the program accidentally turns on the number 2 or number 3 gesture. Therefore I have to find a way to open the number 1 gesture and keep the others closed until they get their gesture. Will eventually figure it out.
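The latching behavior described in today's entry (a selection gesture opens its gate and keeps the others closed until a different selection gesture arrives) could be sketched as a simple selector. A hypothetical Python sketch; the gesture labels and destination names are placeholders, not the actual recognition labels:

```python
class VPESelector:
    """Latching selector: a selection gesture routes all following
    up/down data to one destination until another selection arrives."""

    SELECT = {"one_finger": "volume",
              "two_finger": "pan",
              "three_finger": "effect"}

    def __init__(self):
        self.active = None  # no gate open yet

    def feed(self, msg):
        if msg in self.SELECT:
            self.active = self.SELECT[msg]  # latch the new destination
            return None
        if self.active is None:
            return None                     # nothing selected yet: drop data
        return (self.active, msg)           # route data to the latched gate

sel = VPESelector()
sel.feed("one_finger")
print(sel.feed(0.7))    # ('volume', 0.7)
print(sel.feed(0.8))    # ('volume', 0.8): stays latched
sel.feed("two_finger")
print(sel.feed(0.5))    # ('pan', 0.5)
```

The point of the latch is exactly the bug described above: a stray "two_finger" recognition only changes the routing, it never leaks data into a gate that was not explicitly selected.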
16.11. Worked on the V/P/E switch; found a solution that might be OK for the first version. Have to write more about the process tomorrow; running out of time, it is already 17:09.
Working on the interface for ConDiS. Examining the use of toggle buttons to change colors.
17.11. Have to find out how to change the colors of the Volume, Pan, Effect selector bpatcher. Also how to start changing the volume that is playing.
Pan control: If the conductor controls the spread or width of the pan, then use a slider from 0-1, like
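One way a 0-1 spread slider could map to pan positions is to place the tracks symmetrically around the center and scale their distance by the spread value. A hypothetical equal-power sketch; the function names and the symmetric mapping are my assumptions, not the patch:

```python
import math

def pan_gains(pos):
    """Equal-power pan law: pos 0 = hard left, 1 = hard right."""
    angle = pos * math.pi / 2.0
    return math.cos(angle), math.sin(angle)  # (left gain, right gain)

def spread_positions(n, spread):
    """Place n tracks around the center (0.5), scaled by spread 0-1.

    spread 0 collapses everything to mono center; spread 1 uses the
    full stereo width.
    """
    if n == 1:
        return [0.5]
    return [0.5 + spread * (i / (n - 1) - 0.5) for i in range(n)]

print(spread_positions(4, 0.0))  # all tracks at center
print(spread_positions(4, 1.0))  # full width, outer tracks hard L/R
```

With the equal-power law the combined loudness stays roughly constant as a track moves across the field, which avoids the dip a plain linear crossfade would cause at the center.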
19.11. Volume comes in from the score object into the router and
23.11.
50 + 3 X = volume from score
53 + 0 Y = volume from ConGlove
ON puts Y to 0 point
OFF stores the latest incoming number
NB! scale object for incoming ConGlove messages. Change object for incoming on/off from toggle.
The incoming message from VolScore is a variable 0-127. When ConGlove VolChange is turned on, it needs to start at the 0 point and go to max +/-. If VolChange is turned off, it needs to keep the latest VolChange number.
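The notes above describe two sources being combined: the score volume (X, 0-127) and a glove offset (Y) that is zeroed when VolChange turns ON and frozen at its latest value when it turns OFF. One possible reading of that logic, as a hypothetical Python sketch (class and method names are mine):

```python
class VolumeMixer:
    """Combine score volume (0-127) with a ConGlove offset.

    ON zeroes the glove offset, so relative changes start from the
    current score level; OFF freezes the latest offset so it keeps
    applying until the next ON."""

    def __init__(self):
        self.offset = 0
        self.active = False

    def vol_change(self, on):
        if on:
            self.offset = 0   # ON: put Y at the 0 point
        self.active = on      # OFF: simply stop updating, keep last Y

    def glove(self, value):
        if self.active:
            self.offset = value  # scaled incoming ConGlove message

    def output(self, score_vol):
        # clamp to the MIDI-style 0-127 range
        return max(0, min(127, score_vol + self.offset))

m = VolumeMixer()
m.vol_change(True)
m.glove(3)
print(m.output(50))  # 53, matching the "50 + 3" note above
m.vol_change(False)
print(m.output(53))  # 56: the stored offset still applies
```

Whether the frozen offset should keep applying after OFF, or only be recalled at the next ON, is ambiguous in the note; the sketch picks the first reading.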
27.11. Had to give up on finding a way to add volume to each track + the master (all) tracks… Things to do after the concert on Thursday (1 Dec). Important rehearsal tonight; have to prepare for that.
Things to do:
Make a new breadboard for the glove; the middle finger is not working properly. Set up the multi-channel studio.