Karsten gave me a couple of new assignments last week:
– export the matrices to LaTeX in order to include references. This would give a better idea of the relative importance of a non-empty cell, since the number of references inside it could be an indicator of that importance.
– build a small prototype Android application that presents the user with a basic interface for emotion detection. Since no detection algorithm is available yet, the emotions can be presented to the user at random for now. Afterwards, a song will be made available to play, based on the detected emotion.
– prepare a small presentation for our meeting with imec (the one I talked about last week) to inform them about the research, experiments and required resources.
As for the matrices, that task proved to be a lot more time-consuming than I anticipated. This time, I not only needed to skim all the papers on emotion detection, but also to read entire sections in great detail… The sections about the employed research procedure and the accuracy of the results were of particular importance to me, since this is the part that could open the door for future research like mine. Although I managed to extract a great deal of research references, there is still some work to be done on the export of the matrices to LaTeX, as you can see for yourself below. The references corresponding to the numbers in the matrices can be found at the bottom of this post. An overview of all references (including the ones not used in the matrices) can be found under the Bibliography tab, but beware of the different numbering.
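To give an idea of what the export should eventually look like, here is a minimal LaTeX sketch of one matrix: row/column labels and the reference numbers in the cells are hypothetical placeholders, not the actual matrix contents.

```latex
% Sketch of an exported matrix: each non-empty cell lists the
% numbers of the references supporting that combination, so denser
% cells signal a more thoroughly studied combination.
\begin{tabular}{l|ccc}
           & Anger   & Joy    & Sadness \\ \hline
Sensor A   & [3, 12] & [12]   &         \\
Sensor B   & [7]     &        & [3, 7]  \\
Sensor C   &         & [3]    & [12]    \\
\end{tabular}
```

Once the extraction is complete, generating these tables from the reference data should be a matter of a small script rather than manual typesetting.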
Next, I managed to build the small prototype that Karsten asked for, once again spending a couple of hours exploring some new Android features along the way, the most important one being media playback. Although I am quite happy with the resulting prototype, it is not yet able to play songs. The interface looks clean and is easy to navigate, but the next revisions will focus on a much smoother look and feel. A couple of screenshots can be found below.
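The core flow of the prototype can be sketched in plain Java; the class, emotion labels and song file names below are hypothetical placeholders, and the Android-specific parts (the UI and `android.media.MediaPlayer`) are left out so the sketch stays self-contained.

```java
import java.util.Random;

// Minimal sketch of the prototype's flow: since no detection
// algorithm exists yet, an emotion is picked at random and then
// mapped to a song that the app would offer to play.
public class EmotionPrototype {
    enum Emotion { HAPPY, SAD, ANGRY, RELAXED }

    private final Random random = new Random();

    // Stand-in for the future detection algorithm: a random pick.
    Emotion detectEmotion() {
        Emotion[] values = Emotion.values();
        return values[random.nextInt(values.length)];
    }

    // Map the detected emotion to a song file; in the Android app
    // this file would be handed to a MediaPlayer instance.
    String songFor(Emotion emotion) {
        switch (emotion) {
            case HAPPY: return "upbeat_track.mp3";
            case SAD:   return "slow_ballad.mp3";
            case ANGRY: return "heavy_riff.mp3";
            default:    return "ambient_calm.mp3";
        }
    }

    public static void main(String[] args) {
        EmotionPrototype app = new EmotionPrototype();
        Emotion detected = app.detectEmotion();
        System.out.println(detected + " -> " + app.songFor(detected));
    }
}
```

Swapping the random pick for a real classifier later should then only require replacing `detectEmotion()`, leaving the playback path untouched.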
Finally, there is the presentation for the meeting with imec, which I needed to prepare in order to make a good impression and inform the people from imec about my research. On Friday, it turned out that only one person from imec attended the meeting: Elena, a very friendly doctoral researcher. Joris (who arranged the meeting in the first place) was also there, as were Karsten and Katrien. Although I was a bit nervous, the presentation went well and everybody in the room was really excited about my research. Luckily for me, Elena showed a lot of interest and enthusiasm: mission accomplished! We went back to the slide about my additional sensor requirements and discussed several interesting possibilities. We came to the conclusion that it would be best to use an Empatica E4 Wristband (https://www.empatica.com/e4-wristband), containing a PPG, EDA and temperature sensor as well as an accelerometer. This device would then be able to measure BVP, skin conductance, skin temperature and movement. Elena also mentioned the use of an ECG sensor, the details of which still need to be discussed. Needless to say: the meeting was a great success!
 M. Adibuzzaman, N. Jain, N. Steinhafel, M. Haque, F. Ahmed, S. I. Ahamed, and R. Love. Towards in Situ Affect Detection in Mobile Devices: A Multimodal Approach. In Proceedings of the 2013 Research in Adaptive and Convergent Systems, RACS ’13, pages 454–460, New York, NY, USA, 2013. ACM.
 S. Alghowinem, M. AlShehri, R. Goecke, and M. Wagner. Exploring Eye Activity as an Indication of Emotional States Using an Eye-Tracking Sensor. In L. Chen, S. Kapoor, and R. Bhatia, editors, Intelligent Systems for Science and Information, number 542 in Studies in Computational Intelligence, pages 261–276. Springer International Publishing, 2014. DOI: 10.1007/978-3-319-04702-7_15.
 K. Asawa. Recognition of Emotions using Energy Based Bimodal Information Fusion and Correlation. International Journal of Artificial Intelligence and Interactive Multimedia, 2(Special Issue on Multisensor User Tracking and Analytics to Improve Education and other Application Fields):17–21, 2014.
 E. L. Broek. Ubiquitous Emotion-aware Computing. Personal Ubiquitous Comput., 17(1):53–67, Jan. 2013.
 S. Deng, D. Wang, X. Li, and G. Xu. Exploring user emotion in microblogs for music recommendation. Expert Systems with Applications, 42(23):9284–9293, Dec. 2015.
 C. Epp, M. Lippold, and R. L. Mandryk. Identifying Emotional States Using Keystroke Dynamics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’11, pages 715–724, New York, NY, USA, 2011. ACM.
 S. H. Fairclough, A. J. Karran, and K. Gilleade. Classification Accuracy from the Perspective of the User: Real-Time Interaction with Physiological Computing. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15, pages 3029–3038, New York, NY, USA, 2015. ACM.
 D. A. Gómez Jáuregui and J.-C. Martin. Evaluation of Vision-based Real-time Measures for Emotions Discrimination Under Uncontrolled Conditions. In Proceedings of the 2013 on Emotion Recognition in the Wild Challenge and Workshop, EmotiW ’13, pages 17–22, New York, NY, USA, 2013. ACM.
 B. Guthier, R. Alharthi, R. Abaalkhail, and A. El Saddik. Detection and Visualization of Emotions in an Affect-Aware City. In Proceedings of the 1st International Workshop on Emerging Multimedia Applications and Services for Smart Cities, EMASC ’14, pages 23–28, New York, NY, USA, 2014. ACM.
 A. Haag, S. Goronzy, P. Schaich, and J. Williams. Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System. In E. André, L. Dybkjær, W. Minker, and P. Heisterkamp, editors, Affective Dialogue Systems, number 3068 in Lecture Notes in Computer Science, pages 36–48. Springer Berlin Heidelberg, June 2004. DOI: 10.1007/978-3-540-24842-2_4.
 W. Han, H. Li, F. Eyben, L. Ma, J. Sun, and B. Schuller. Preserving Actual Dynamic Trend of Emotion in Dimensional Speech Emotion Recognition. In Proceedings of the 14th ACM International Conference on Multimodal Interaction, ICMI ’12, pages 523–528, New York, NY, USA, 2012. ACM.
 R. L. Hazlett. Measuring Emotional Valence During Interactive Experiences: Boys at Video Game Play. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’06, pages 1023–1026, New York, NY, USA, 2006. ACM.
 V. H.d, A. K.r, and K. K. Emotion Recognition from Decision Level Fusion of Visual and Acoustic Features Using Hausdorff Classifier. In K. R. Venugopal and L. M. Patnaik, editors, Computer Networks and Intelligent Computing, number 157 in Communications in Computer and Information Science, pages 601–610. Springer Berlin Heidelberg, 2011. DOI: 10.1007/978-3-642-22786-8_76.
 M. Khezri, M. Firoozabadi, and A. R. Sharafat. Reliable emotion recognition system based on dynamic adaptive fusion of forehead biopotentials and physiological signals. Computer Methods and Programs in Biomedicine, 2015.
 S.-J. Kim, J.-S. Kim, S.-H. Kim, and Y.-M. Kim. Evolvable Recommendation System in the Portable Device Based on the Emotion Awareness. In R. Khosla, R. J. Howlett, and L. C. Jain, editors, Knowledge-Based Intelligent Information and Engineering Systems, number 3682 in Lecture Notes in Computer Science, pages 251–257. Springer Berlin Heidelberg, Sept. 2005. DOI: 10.1007/11552451_34.
 L. I. Kuncheva, T. Christy, I. Pierce, and S. P. Mansoor. Multi-modal Biometric Emotion Recognition Using Classifier Ensembles. In Proceedings of the 24th International Conference on Industrial Engineering and Other Applications of Applied Intelligent Systems Conference on Modern Approaches in Applied Intelligence – Volume Part I, IEA/AIE’11, pages 317–326, Berlin, Heidelberg, 2011. Springer-Verlag.
 T. Lahti, M. Helén, O. Vuorinen, E. Väyrynen, J. Partala, J. Peltola, and S.-M. Mäkelä. On Enabling Techniques for Personal Audio Content Management. In Proceedings of the 1st ACM International Conference on Multimedia Information Retrieval, MIR ’08, pages 113–120, New York, NY, USA, 2008. ACM.
 F. Nasoz, K. Alvarez, C. L. Lisetti, and N. Finkelstein. Emotion recognition from physiological signals using wireless sensors for presence technologies. Cognition, Technology & Work, 6(1):4–14, Dec. 2003.
 J. Nicolle, V. Rapp, K. Bailly, L. Prevost, and M. Chetouani. Robust Continuous Prediction of Human Emotions Using Multiscale Dynamic Cues. In Proceedings of the 14th ACM International Conference on Multimodal Interaction, ICMI ’12, pages 501–508, New York, NY, USA, 2012. ACM.
 K. Rattanyu and M. Mizukawa. Emotion Recognition Using Biological Signal in Intelligent Space. In J. A. Jacko, editor, Human-Computer Interaction. Towards Mobile and Intelligent Interaction Environments, number 6763 in Lecture Notes in Computer Science, pages 586–592. Springer Berlin Heidelberg, July 2011. DOI: 10.1007/978-3-642-21616-9_66.
 G. Rigas, C. D. Katsis, G. Ganiatsas, and D. I. Fotiadis. A User Independent, Biosignal Based, Emotion Recognition Method. In C. Conati, K. McCoy, and G. Paliouras, editors, User Modeling 2007, number 4511 in Lecture Notes in Computer Science, pages 314–318. Springer Berlin Heidelberg, July 2007. DOI: 10.1007/978-3-540-73078-1_36.
 V. N. Salimpoor, M. Benovoy, G. Longo, J. R. Cooperstock, and R. J. Zatorre. The Rewarding Aspects of Music Listening Are Related to Degree of Emotional Arousal. PLoS ONE, 4(10):e7487, Oct. 2009.
 N. K. Suryadevara, T. Quazi, and S. C. Mukhopadhyay. Smart Sensing System for Human Emotion and Behaviour Recognition. In M. K. Kundu, S. Mitra, D. Mazumdar, and S. K. Pal, editors, Perception and Machine Intelligence, number 7143 in Lecture Notes in Computer Science, pages 11–22. Springer Berlin Heidelberg, 2012. DOI: 10.1007/978-3-642-27387-2_2.
 K. Takahashi and I. Sugimoto. Remarks on Emotion Recognition Using Breath Gas Sensing System. In S. C. Mukhopadhyay and G. S. Gupta, editors, Smart Sensors and Sensing Technology, number 20 in Lecture Notes in Electrical Engineering, pages 49–62. Springer Berlin Heidelberg, 2008. DOI: 10.1007/978-3-540-79590-2_4.
 G. Valenza, A. Lanatá, and E. P. Scilingo. Improving emotion recognition systems by embedding cardiorespiratory coupling. Physiological Measurement, 34(4):449–464, Apr. 2013.
 M. Vanny, S.-M. Park, K.-E. Ko, and K.-B. Sim. Analysis of Physiological Signals for Emotion Recognition Based on Support Vector Machine. In J.-H. Kim, E. T. Matson, H. Myung, and P. Xu, editors, Robot Intelligence Technology and Applications 2012, number 208 in Advances in Intelligent Systems and Computing, pages 115–125. Springer Berlin Heidelberg, 2013. DOI: 10.1007/978-3-642-37374-9_12.
 P. A. Vijaya and G. Shivakumar. Galvanic Skin Response: A Physiological Sensor System for Affective Computing. International Journal of Machine Learning and Computing, pages 31–34, 2013.
 B. P. Woolf, I. Arroyo, D. Cooper, W. Burleson, and K. Muldner. Affective Tutors: Automatic Detection of and Response to Student Emotion. In R. Nkambou, J. Bourdeau, and R. Mizoguchi, editors, Advances in Intelligent Tutoring Systems, number 308 in Studies in Computational Intelligence, pages 207–227. Springer Berlin Heidelberg, 2010. DOI: 10.1007/978-3-642-14363-2_10.