Bibliography

[1] Google Inc., “Android Compatibility Definition Document.”

[2] J. R. J. Fontaine, K. R. Scherer, E. B. Roesch, and P. C. Ellsworth, “The World of Emotions is not Two-Dimensional,” Psychological Science, vol. 18, no. 12, pp. 1050–1057, Dec. 2007.

[3] M. Pielot, T. Dingler, J. San Pedro, and N. Oliver, “When Attention is Not Scarce – Detecting Boredom from Mobile Phone Usage,” in Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, New York, NY, USA, 2015, pp. 825–836.

[4] E. L. van den Broek, “Ubiquitous Emotion-aware Computing,” Personal and Ubiquitous Computing, vol. 17, no. 1, pp. 53–67, Jan. 2013.

[5] M. Paleari and C. L. Lisetti, “Toward Multimodal Fusion of Affective Cues,” in Proceedings of the 1st ACM International Workshop on Human-centered Multimedia, New York, NY, USA, 2006, pp. 99–108.

[6] V. N. Salimpoor, M. Benovoy, G. Longo, J. R. Cooperstock, and R. J. Zatorre, “The Rewarding Aspects of Music Listening Are Related to Degree of Emotional Arousal,” PLoS ONE, vol. 4, no. 10, p. e7487, Oct. 2009.

[7] R. E. Krout, “Music listening to facilitate relaxation and promote wellness: Integrated aspects of our neurophysiological responses to music,” The Arts in Psychotherapy, vol. 34, no. 2, pp. 134–141, 2007.

[8] R. W. Levenson, “The Autonomic Nervous System and Emotion,” Emotion Review, vol. 6, no. 2, pp. 100–112, Apr. 2014.

[9] D. Su and P. Fung, “Personalized Music Emotion Classification via Active Learning,” in Proceedings of the Second International ACM Workshop on Music Information Retrieval with User-centered and Multimodal Strategies, New York, NY, USA, 2012, pp. 57–62.

[10] T. Lahti, M. Helén, O. Vuorinen, E. Väyrynen, J. Partala, J. Peltola, and S.-M. Mäkelä, “On Enabling Techniques for Personal Audio Content Management,” in Proceedings of the 1st ACM International Conference on Multimedia Information Retrieval, New York, NY, USA, 2008, pp. 113–120.

[11] D. C. Correa, J. H. Saito, and L. da F. Costa, “Musical genres: beating to the rhythms of different drums,” New Journal of Physics, vol. 12, no. 5, p. 053030, 2010.

[12] C. Orellana-Rodriguez, E. Diaz-Aviles, and W. Nejdl, “Mining Affective Context in Short Films for Emotion-Aware Recommendation,” in Proceedings of the 26th ACM Conference on Hypertext & Social Media, New York, NY, USA, 2015, pp. 185–194.

[13] T. Mioch, T. R. A. Giele, N. J. J. M. Smets, and M. A. Neerincx, “Measuring Emotions of Robot Operators in Urban Search and Rescue Missions,” in Proceedings of the 31st European Conference on Cognitive Ergonomics, New York, NY, USA, 2013, pp. 20:1–20:7.

[14] R. L. Hazlett, “Measuring Emotional Valence During Interactive Experiences: Boys at Video Game Play,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2006, pp. 1023–1026.

[15] I. B. Mauss and M. D. Robinson, “Measures of emotion: A review,” Cognition and Emotion, vol. 23, no. 2, pp. 209–237, Feb. 2009.

[16] M. Adibuzzaman, N. Jain, N. Steinhafel, M. Haque, F. Ahmed, S. Ahamed, and R. Love, “In Situ Affect Detection in Mobile Devices: A Multimodal Approach for Advertisement Using Social Network,” SIGAPP Appl. Comput. Rev., vol. 13, no. 4, pp. 67–77, Dec. 2013.

[17] K. Wakil, R. Bakhtyar, K. Ali, and K. Alaadin, “Improving Web Movie Recommender System Based on Emotions,” International Journal of Advanced Computer Science and Applications, vol. 6, no. 2, 2015.

[18] M. Lee, K. Kim, H. Rho, and S. J. Kim, “Empa Talk: A Physiological Data Incorporated Human-computer Interactions,” in CHI ’14 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2014, pp. 1897–1902.

[19] G. B. Gil, A. Berlanga, and J. M. Molina, “EmotionContext: User Emotion Dataset Using Smartphones,” in Ambient Assisted Living and Home Care, J. Bravo, R. Hervás, and M. Rodríguez, Eds. Springer Berlin Heidelberg, 2012, pp. 371–374.

[20] C. Peter and A. Herbon, “Emotion representation and physiology assignments in digital systems,” Interacting with Computers, vol. 18, no. 2, pp. 139–170, Mar. 2006.

[21] J. Jaimovich, N. Coghlan, and R. B. Knapp, “Emotion in Motion: A Study of Music and Affective Response,” in From Sounds to Music and Emotions, M. Aramaki, M. Barthet, R. Kronland-Martinet, and S. Ystad, Eds. Springer Berlin Heidelberg, 2012, pp. 19–43.

[22] A. Jamdar, J. Abraham, K. Khanna, and R. Dubey, “Emotion Analysis of Songs Based on Lyrical and Audio Features,” International Journal of Artificial Intelligence & Applications, vol. 6, no. 3, pp. 35–50, May 2015.

[23] M. A. Nicolaou, S. Zafeiriou, and M. Pantic, “Correlated-spaces Regression for Learning Continuous Emotion Dimensions,” in Proceedings of the 21st ACM International Conference on Multimedia, New York, NY, USA, 2013, pp. 773–776.

[24] J. Qin and Q. Zheng, “An Emotion-oriented Music Recommendation Algorithm Fusing Rating and Trust,” International Journal of Computational Intelligence Systems, vol. 7, no. 2, pp. 371–381, 2013.

[25] K. Church, E. Hoggan, and N. Oliver, “A Study of Mobile Mood Awareness and Communication Through MobiMood,” in Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, New York, NY, USA, 2010, pp. 128–137.

[26] D. Leony and H. A. Parada G., “A Generic Architecture for Emotion-based Recommender Systems in Cloud Learning Environments,” Journal of Universal Computer Science, vol. 19, no. 14, pp. 2075–2092, 2013.

[27] D. Griffiths, S. Cunningham, and J. Weinel, “A Discussion of Musical Features for Automatic Music Playlist Generation Using Affective Technologies,” in Proceedings of the 8th Audio Mostly Conference, New York, NY, USA, 2013, pp. 13:1–13:4.

[28] E. Lee, G.-W. Kim, B.-S. Kim, and M.-A. Kang, “A Design Platform for Emotion-Aware User Interfaces,” in Proceedings of the 2014 Workshop on Emotion Representation and Modelling in Human-Computer-Interaction-Systems, New York, NY, USA, 2014, pp. 19–24.

[29] K. Asawa, “Recognition of Emotions using Energy Based Bimodal Information Fusion and Correlation,” International Journal of Interactive Multimedia and Artificial Intelligence, vol. 2, Special Issue on Multisensor User Tracking and Analytics to Improve Education and Other Application Fields, pp. 17–21, 2014.

[30] J. Nicolle, V. Rapp, K. Bailly, L. Prevost, and M. Chetouani, “Robust Continuous Prediction of Human Emotions Using Multiscale Dynamic Cues,” in Proceedings of the 14th ACM International Conference on Multimodal Interaction, New York, NY, USA, 2012, pp. 501–508.

[31] J. Rojahn, F. Gerhards, S. T. Matlock, and T. L. Kroeger, “Reliability and validity studies of the Facial Discrimination Task for emotion research,” Psychiatry Research, vol. 95, no. 2, pp. 169–181, Aug. 2000.

[32] J. Haberman and D. Whitney, “Rapid extraction of mean emotion and gender from sets of faces,” Current Biology, vol. 17, no. 17, pp. R751–R753, Sep. 2007.

[33] W. Han, H. Li, F. Eyben, L. Ma, J. Sun, and B. Schuller, “Preserving Actual Dynamic Trend of Emotion in Dimensional Speech Emotion Recognition,” in Proceedings of the 14th ACM International Conference on Multimodal Interaction, New York, NY, USA, 2012, pp. 523–528.

[34] C. S. S. Tan, J. Schöning, K. Luyten, and K. Coninx, “Informing Intelligent User Interfaces by Inferring Affective States from Body Postures in Ubiquitous Computing Environments,” in Proceedings of the 2013 International Conference on Intelligent User Interfaces, New York, NY, USA, 2013, pp. 235–246.

[35] M. Tkalčič, A. Odić, A. Košir, and J. F. Tasič, “Impact of Implicit and Explicit Affective Labeling on a Recommender System’s Performance,” in Advances in User Modeling, L. Ardissono and T. Kuflik, Eds. Springer Berlin Heidelberg, 2011, pp. 342–354.

[36] C. Epp, M. Lippold, and R. L. Mandryk, “Identifying Emotional States Using Keystroke Dynamics,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2011, pp. 715–724.

[37] R. J. Erwin, R. C. Gur, R. E. Gur, B. Skolnick, M. Mawhinney-Hee, and J. Smailis, “Facial emotion discrimination: I. Task construction and behavioral findings in normal subjects,” Psychiatry Research, vol. 42, no. 3, pp. 231–240, Jun. 1992.

[38] S. Deng, D. Wang, X. Li, and G. Xu, “Exploring user emotion in microblogs for music recommendation,” Expert Systems with Applications, vol. 42, no. 23, pp. 9284–9293, Dec. 2015.

[39] D. A. Gómez Jáuregui and J.-C. Martin, “Evaluation of Vision-based Real-time Measures for Emotions Discrimination Under Uncontrolled Conditions,” in Proceedings of the 2013 Emotion Recognition in the Wild Challenge and Workshop, New York, NY, USA, 2013, pp. 17–22.

[40] L. X. Liao, A. M. Corsi, P. Chrysochou, and L. Lockshin, “Emotional responses towards food packaging: A joint application of self-report and physiological measures of emotion,” Food Quality and Preference, vol. 42, pp. 48–55, Jun. 2015.

[41] K. H. Hyun, E. H. Kim, and Y. K. Kwak, “Emotional Feature Extraction Method Based on the Concentration of Phoneme Influence for Human–Robot Interaction,” Advanced Robotics, vol. 24, no. 1–2, pp. 47–67, Jan. 2010.

[42] N. J. Gogoi and P. Das, “Emotion Recognition from Speech Signal: Realization and Available Techniques,” International Journal of Engineering Science and Technology, vol. 6, no. 5, pp. 188–191, May 2014.

[43] V. H.D., A. K.R., and K. K., “Emotion Recognition from Decision Level Fusion of Visual and Acoustic Features Using Hausdorff Classifier,” in Computer Networks and Intelligent Computing, K. R. Venugopal and L. M. Patnaik, Eds. Springer Berlin Heidelberg, 2011, pp. 601–610.

[44] H. Binali and V. Potdar, “Emotion Detection State of the Art,” in Proceedings of the CUBE International Information Technology Conference, New York, NY, USA, 2012, pp. 501–507.

[45] J. R. Bellegarda, “Emotion Analysis Using Latent Affective Folding and Embedding,” in Proceedings of the NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text, Stroudsburg, PA, USA, 2010, pp. 1–9.

[46] B. Guthier, R. Alharthi, R. Abaalkhail, and A. El Saddik, “Detection and Visualization of Emotions in an Affect-Aware City,” in Proceedings of the 1st International Workshop on Emerging Multimedia Applications and Services for Smart Cities, New York, NY, USA, 2014, pp. 23–28.

[47] S. Alghowinem, M. AlShehri, R. Goecke, and M. Wagner, “Exploring Eye Activity as an Indication of Emotional States Using an Eye-Tracking Sensor,” in Intelligent Systems for Science and Information, L. Chen, S. Kapoor, and R. Bhatia, Eds. Springer International Publishing, 2014, pp. 261–276.

[48] J.-H. Lee, Y. Hwang, K.-A. Cheon, and H.-I. Jung, “Emotion-on-a-chip (EOC): evolution of biochip technology to measure human emotion using body fluids,” Medical Hypotheses, vol. 79, no. 6, pp. 827–832, Dec. 2012.

[49] M. Ouwerkerk, F. Pasveer, and G. Langereis, “Unobtrusive Sensing of Psychophysiological Parameters,” in Probing Experience, J. H. D. M. Westerink, M. Ouwerkerk, T. J. M. Overbeek, W. F. Pasveer, and B. de Ruyter, Eds. Springer Netherlands, 2008, pp. 163–193.

[50] M. Ouwerkerk, “Unobtrusive Emotions Sensing in Daily Life,” in Sensing Emotions, J. Westerink, M. Krans, and M. Ouwerkerk, Eds. Springer Netherlands, 2010, pp. 21–39.

[51] M. Adibuzzaman, N. Jain, N. Steinhafel, M. Haque, F. Ahmed, S. I. Ahamed, and R. Love, “Towards in Situ Affect Detection in Mobile Devices: A Multimodal Approach,” in Proceedings of the 2013 Research in Adaptive and Convergent Systems, New York, NY, USA, 2013, pp. 454–460.

[52] K. S. Kassam and W. B. Mendes, “The effects of measuring emotion: physiological reactions to emotional situations depend on whether someone is asking,” PLoS ONE, vol. 8, no. 7, p. e64959, 2013.

[53] U. Graichen, R. Eichardt, P. Fiedler, D. Strohmeier, F. Zanow, and J. Haueisen, “SPHARA–a generalized spatial Fourier analysis for multi-sensor systems with non-uniformly arranged sensors: application to EEG,” PLoS ONE, vol. 10, no. 4, p. e0121741, 2015.

[54] N. K. Suryadevara, T. Quazi, and S. C. Mukhopadhyay, “Smart Sensing System for Human Emotion and Behaviour Recognition,” in Perception and Machine Intelligence, M. K. Kundu, S. Mitra, D. Mazumdar, and S. K. Pal, Eds. Springer Berlin Heidelberg, 2012, pp. 11–22.

[55] P. Paredes, D. Sun, and J. Canny, “Sensor-less Sensing for Affective Computing and Stress Management Technology,” in Proceedings of the 7th International Conference on Pervasive Computing Technologies for Healthcare, ICST, Brussels, Belgium, 2013, pp. 459–463.

[56] K. Takahashi and I. Sugimoto, “Remarks on Emotion Recognition Using Breath Gas Sensing System,” in Smart Sensors and Sensing Technology, S. C. Mukhopadhyay and G. S. Gupta, Eds. Springer Berlin Heidelberg, 2008, pp. 49–62.

[57] M. Khezri, M. Firoozabadi, and A. R. Sharafat, “Reliable emotion recognition system based on dynamic adaptive fusion of forehead biopotentials and physiological signals,” Computer Methods and Programs in Biomedicine.

[58] K. Pollmann, “Real-Time Emotion Detection for Neuro-Adaptive Systems,” in Proceedings of the 20th International Conference on Intelligent User Interfaces Companion, New York, NY, USA, 2015, pp. 109–112.

[59] R. Schmitt and M. K., “Objectifying User Attention and Emotion Evoked by Relevant Perceived Product Components,” Journal of Sensors and Sensor Systems, vol. 3, no. 2, 2013.

[60] L. I. Kuncheva, T. Christy, I. Pierce, and S. P. Mansoor, “Multi-modal Biometric Emotion Recognition Using Classifier Ensembles,” in Proceedings of the 24th International Conference on Industrial Engineering and Other Applications of Applied Intelligent Systems Conference on Modern Approaches in Applied Intelligence – Volume Part I, Berlin, Heidelberg, 2011, pp. 317–326.

[61] G. Valenza, A. Lanatá, and E. P. Scilingo, “Improving emotion recognition systems by embedding cardiorespiratory coupling,” Physiological Measurement, vol. 34, no. 4, pp. 449–464, Apr. 2013.

[62] P. A. Vijaya and G. Shivakumar, “Galvanic Skin Response: A Physiological Sensor System for Affective Computing,” International Journal of Machine Learning and Computing, pp. 31–34, 2013.

[63] M. Omata, D. Kanuka, and X. Mao, “Experiments for Emotion Estimation from Biological Signals and Its Application,” in Transactions on Computational Science XXIII, M. L. Gavrilova, C. J. K. Tan, X. Mao, and L. Hong, Eds. Springer Berlin Heidelberg, 2014, pp. 178–198.

[64] H. A. Maior, M. Pike, S. Sharples, and M. L. Wilson, “Examining the Reliability of Using fNIRS in Realistic HCI Settings for Spatial and Verbal Tasks,” in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2015, pp. 3039–3042.

[65] S.-J. Kim, J.-S. Kim, S.-H. Kim, and Y.-M. Kim, “Evolvable Recommendation System in the Portable Device Based on the Emotion Awareness,” in Knowledge-Based Intelligent Information and Engineering Systems, R. Khosla, R. J. Howlett, and L. C. Jain, Eds. Springer Berlin Heidelberg, 2005, pp. 251–257.

[66] S. W. Gilroy, M. O. Cavazza, and V. Vervondel, “Evaluating Multimodal Affective Fusion Using Physiological Signals,” in Proceedings of the 16th International Conference on Intelligent User Interfaces, New York, NY, USA, 2011, pp. 53–62.

[67] C. Peter, R. Schultz, J. Voskamp, B. Urban, N. Nowack, H. Janik, K. Kraft, and R. Göcke, “EREC-II in Use – Studies on Usability and Suitability of a Sensor System for Affect Detection and Human Performance Monitoring,” in Human-Computer Interaction. HCI Intelligent Multimodal Interaction Environments, J. A. Jacko, Ed. Springer Berlin Heidelberg, 2007, pp. 465–474.

[68] E. Kanjo, L. Al-Husain, and A. Chamberlain, “Emotions in context: examining pervasive affective sensing systems, applications, and analyses,” Personal and Ubiquitous Computing, vol. 19, no. 7, pp. 1197–1212, Apr. 2015.

[69] F. H. Wilhelm and P. Grossman, “Emotions beyond the laboratory: Theoretical fundaments, study design, and analytic strategies for advanced ambulatory assessment,” Biological Psychology, vol. 84, no. 3, pp. 552–569, Jul. 2010.

[70] K. Rattanyu and M. Mizukawa, “Emotion Recognition Using Biological Signal in Intelligent Space,” in Human-Computer Interaction. Towards Mobile and Intelligent Interaction Environments, J. A. Jacko, Ed. Springer Berlin Heidelberg, 2011, pp. 586–592.

[71] A. Haag, S. Goronzy, P. Schaich, and J. Williams, “Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System,” in Affective Dialogue Systems, E. André, L. Dybkjær, W. Minker, and P. Heisterkamp, Eds. Springer Berlin Heidelberg, 2004, pp. 36–48.

[72] F. Nasoz, K. Alvarez, C. L. Lisetti, and N. Finkelstein, “Emotion recognition from physiological signals using wireless sensors for presence technologies,” Cognition, Technology & Work, vol. 6, no. 1, pp. 4–14, Dec. 2003.

[73] M. Szwoch and W. Szwoch, “Emotion Recognition for Affect Aware Video Games,” in Image Processing & Communications Challenges 6, R. S. Choraś, Ed. Springer International Publishing, 2015, pp. 227–236.

[74] Z. Zeng, M. Pantic, and T. S. Huang, “Emotion Recognition Based on Multimodal Information,” in Affective Information Processing, J. Tao and T. Tan, Eds. Springer London, 2009, pp. 241–265.

[75] A. Landowska, “Emotion Monitoring – Verification of Physiological Characteristics Measurement Procedures,” Metrology and Measurement Systems, vol. 21, no. 4, 2014.

[76] C. Peter and B. Urban, “Emotion in Human-Computer Interaction,” in Expanding the Frontiers of Visual Analytics and Visualization, J. Dill, R. Earnshaw, D. Kasik, J. Vince, and P. C. Wong, Eds. Springer London, 2012, pp. 239–262.

[77] M. Mikhail, K. El-Ayat, R. El Kaliouby, J. Coan, and J. J. B. Allen, “Emotion Detection Using Noisy EEG Data,” in Proceedings of the 1st Augmented Human International Conference, New York, NY, USA, 2010, pp. 7:1–7:7.

[78] A. Martínez-Rodrigo, R. Zangróniz, J. M. Pastor, J. M. Latorre, and A. Fernández-Caballero, “Emotion Detection in Ageing Adults from Physiological Sensors,” in Ambient Intelligence – Software and Applications, A. Mohamed, P. Novais, A. Pereira, G. V. González, and A. Fernández-Caballero, Eds. Springer International Publishing, 2015, pp. 253–261.

[79] E. Treacy Solovey, D. Afergan, E. M. Peck, S. W. Hincks, and R. J. K. Jacob, “Designing Implicit Interfaces for Physiological Computing: Guidelines and Lessons Learned Using fNIRS,” ACM Trans. Comput.-Hum. Interact., vol. 21, no. 6, pp. 35:1–35:27, Jan. 2015.

[80] V. Gay and P. Leijdekkers, “Design of emotion-aware mobile apps for autistic children,” Health and Technology, vol. 4, no. 1, pp. 21–26, Dec. 2013.

[81] A. Mencattini, F. Ringeval, B. Schuller, E. Martinelli, and C. Di Natale, “Continuous Monitoring of Emotions by a Multimodal Cooperative Sensor System,” Procedia Engineering, vol. 120, pp. 556–559, 2015.

[82] F. H. Wilhelm, M. C. Pfaltz, and P. Grossman, “Continuous electronic data capture of physiology, behavior and experience in real life: towards ecological momentary assessment of emotion,” Interacting with Computers, vol. 18, no. 2, pp. 171–186, Mar. 2006.

[83] S. H. Fairclough, A. J. Karran, and K. Gilleade, “Classification Accuracy from the Perspective of the User: Real-Time Interaction with Physiological Computing,” in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2015, pp. 3029–3038.

[84] C. M. Jones and T. Troen, “Biometric Valence and Arousal Recognition,” in Proceedings of the 19th Australasian Conference on Computer-Human Interaction: Entertaining User Interfaces, New York, NY, USA, 2007, pp. 191–194.

[85] A. E. Youssef, S. F. Aly, A. S. Ibrahim, and A. L. Abbott, “Auto-Optimized Multimodal Expression Recognition Framework Using 3D Kinect Data for ASD Therapeutic Aid,” International Journal of Modeling and Optimization, pp. 112–115, 2013.

[86] V. Loseu, H. Ghasemzadeh, S. Ostadabbas, N. Raveendranathan, J. Malan, and R. Jafari, “Applications of Sensing Platforms with Wearable Computers,” in Proceedings of the 3rd International Conference on PErvasive Technologies Related to Assistive Environments, New York, NY, USA, 2010, pp. 53:1–53:5.

[87] M. Vanny, S.-M. Park, K.-E. Ko, and K.-B. Sim, “Analysis of Physiological Signals for Emotion Recognition Based on Support Vector Machine,” in Robot Intelligence Technology and Applications 2012, J.-H. Kim, E. T. Matson, H. Myung, and P. Xu, Eds. Springer Berlin Heidelberg, 2013, pp. 115–125.

[88] L. Reinerman-Jones, G. Taylor, K. Cosenzo, and S. Lackey, “Analysis of Multiple Physiological Sensor Data,” in Foundations of Augmented Cognition. Directing the Future of Adaptive Systems, D. D. Schmorrow and C. M. Fidopiastis, Eds. Springer Berlin Heidelberg, 2011, pp. 112–119.

[89] H. Leng, Y. Lin, and L. A. Zanzi, “An Experimental Study on Physiological Parameters Toward Driver Emotion Recognition,” in Ergonomics and Health Aspects of Work with Computers, M. J. Dainoff, Ed. Springer Berlin Heidelberg, 2007, pp. 237–246.

[90] B. P. Woolf, I. Arroyo, D. Cooper, W. Burleson, and K. Muldner, “Affective Tutors: Automatic Detection of and Response to Student Emotion,” in Advances in Intelligent Tutoring Systems, R. Nkambou, J. Bourdeau, and R. Mizoguchi, Eds. Springer Berlin Heidelberg, 2010, pp. 207–227.

[91] C. Peter, E. Ebert, and H. Beikirch, “A Wearable Multi-sensor System for Mobile Acquisition of Emotion-Related Physiological Data,” in Affective Computing and Intelligent Interaction, J. Tao, T. Tan, and R. W. Picard, Eds. Springer Berlin Heidelberg, 2005, pp. 691–698.

[92] G. Rigas, C. D. Katsis, G. Ganiatsas, and D. I. Fotiadis, “A User Independent, Biosignal Based, Emotion Recognition Method,” in User Modeling 2007, C. Conati, K. McCoy, and G. Paliouras, Eds. Springer Berlin Heidelberg, 2007, pp. 314–318.

[93] C.-F. Huang and W.-P. Nien, “A Study of the Integrated Automated Emotion Music with the Motion Gesture Synthesis via ZigBee Wireless Communication,” International Journal of Distributed Sensor Networks, vol. 2013, pp. 1–9, 2013.

[94] P. A. Nogueira, R. Rodrigues, E. Oliveira, and L. E. Nacke, “A Hybrid Approach at Emotional State Detection: Merging Theoretical Models of Emotion with Data-Driven Statistical Classifiers,” in Proceedings of the 2013 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT) – Volume 02, Washington, DC, USA, 2013, pp. 253–260.

[95] J. A. Russell, “A circumplex model of affect,” Journal of Personality and Social Psychology, vol. 39, no. 6, pp. 1161–1178, 1980.

[96] R. B. Knapp, J. Kim, and E. André, “Physiological Signals and Their Use in Augmenting Emotion Recognition for Human–Machine Interaction,” in Emotion-Oriented Systems, R. Cowie, C. Pelachaud, and P. Petta, Eds. Springer Berlin Heidelberg, 2011, pp. 133–159.

[97] J. Spigulis, R. Erts, V. Nikiforovs, and E. Kviesis-Kipge, “Wearable wireless photoplethysmography sensors,” in Proceedings of SPIE, vol. 6991, 2008, p. 69912O.

[98] T. Tamura, Y. Maeda, M. Sekine, and M. Yoshida, “Wearable Photoplethysmographic Sensors—Past and Present,” Electronics, vol. 3, no. 2, pp. 282–302, Apr. 2014.

[99] “LED Pulse Sensor (PPG) for Arduino,” Instructables.com. [Online]. Available: http://www.instructables.com/id/LED-Pulse-Sensor-PPG-for-Arduino/. [Accessed: 13-Oct-2015].

[100] S. C. Wriessnegger, A. Pinegger, and G. R. Mueller-Putz, “The Evaluation of Different EEG Sensor Technologies,” in Information Systems and Neuroscience, F. D. Davis, R. Riedl, J. vom Brocke, P.-M. Léger, and A. B. Randolph, Eds. Springer International Publishing, 2015, pp. 85–90.

[101] M.-Z. Poh, N. C. Swenson, and R. W. Picard, “A Wearable Sensor for Unobtrusive, Long-Term Assessment of Electrodermal Activity,” IEEE Transactions on Biomedical Engineering, vol. 57, no. 5, pp. 1243–1252, May 2010.

[102] S. Kang, S. Kwon, C. Yoo, S. Seo, K. Park, J. Song, and Y. Lee, “Sinabro: Opportunistic and Unobtrusive Mobile Electrocardiogram Monitoring System,” in Proceedings of the 15th Workshop on Mobile Computing Systems and Applications, New York, NY, USA, 2014, pp. 11:1–11:6.

[103] N. Belgacem and S. Boumerdassi, “Mobile Personal Electrocardiogram Monitoring System with Patient Location,” in Proceedings of the 1st ACM International Workshop on Medical-grade Wireless Networks, New York, NY, USA, 2009, pp. 69–72.

[104] S.-C. Huang, P.-H. Hung, C.-H. Hong, and H.-M. Wang, “A New Image Blood Pressure Sensor Based on PPG, RRT, BPTT, and Harmonic Balancing,” IEEE Sensors Journal, vol. 14, no. 10, pp. 3685–3692, Oct. 2014.

[105] L. Liu and J. Liu, “Biomedical sensor technologies on the platform of mobile phones,” Frontiers of Mechanical Engineering, vol. 6, no. 2, pp. 160–175, Apr. 2011.

[106] B. Caramiaux, M. Donnarumma, and A. Tanaka, “Understanding Gesture Expressivity Through Muscle Sensing,” ACM Trans. Comput.-Hum. Interact., vol. 21, no. 6, pp. 31:1–31:26, Jan. 2015.

[107] K. Hinckley and H. Song, “Sensor Synaesthesia: Touch in Motion, and Motion in Touch,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2011, pp. 801–810.

[108] R. W. Picard, “Recognizing Stress, Engagement, and Positive Emotion,” in Proceedings of the 20th International Conference on Intelligent User Interfaces, New York, NY, USA, 2015, pp. 3–4.

[109] G. Wilson, G. Davidson, and S. A. Brewster, “In the Heat of the Moment: Subjective Interpretations of Thermal Feedback During Interaction,” in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2015, pp. 2063–2072.

[110] C. Zhang, A. Guo, D. Zhang, C. Southern, R. Arriaga, and G. Abowd, “BeyondTouch: Extending the Input Language with Built-in Sensors on Commodity Smartphones,” in Proceedings of the 20th International Conference on Intelligent User Interfaces, New York, NY, USA, 2015, pp. 67–77.

[111] Y. Gao, N. Bianchi-Berthouze, and H. Meng, “What Does Touch Tell Us About Emotions in Touchscreen-Based Gameplay?,” ACM Trans. Comput.-Hum. Interact., vol. 19, no. 4, pp. 31:1–31:30, Dec. 2012.

[112] G. Huisman and A. Darriba Frederiks, “Towards Tactile Expressions of Emotion Through Mediated Touch,” in CHI ’13 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2013, pp. 1575–1580.

[113] C. Coutrix and N. Mandran, “Identifying Emotions Expressed by Mobile Users Through 2D Surface and 3D Motion Gestures,” in Proceedings of the 2012 ACM Conference on Ubiquitous Computing, New York, NY, USA, 2012, pp. 311–320.

[114] M. Obrist, S. Subramanian, E. Gatti, B. Long, and T. Carter, “Emotions Mediated Through Mid-Air Haptics,” in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2015, pp. 2053–2062.

[115] F. Putze, C. Amma, and T. Schultz, “Design and Evaluation of a Self-Correcting Gesture Interface Based on Error Potentials from EEG,” in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2015, pp. 3375–3384.

[116] R. Kaiser and K. Oertel, “Emotions in HCI: An Affective e-Learning System,” in Proceedings of the HCSNet Workshop on Use of Vision in Human-computer Interaction – Volume 56, Darlinghurst, Australia, 2006, pp. 105–106.

[117] T.-C. Tsai, J.-J. Chen, and W.-C. Lo, “Design and Implementation of Mobile Personal Emotion Monitoring System,” in Proceedings of the 2009 Tenth International Conference on Mobile Data Management: Systems, Services and Middleware, Washington, DC, USA, 2009, pp. 430–435.

[118] H. Wang, H. Prendinger, and T. Igarashi, “Communicating Emotions in Online Chat Using Physiological Sensors and Animated Text,” in CHI ’04 Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2004, pp. 1171–1174.

[119] R. Picard, “Affective Media and Wearables: Surprising Findings,” in Proceedings of the ACM International Conference on Multimedia, New York, NY, USA, 2014, pp. 3–4.

[120] S. Patel, H. Park, P. Bonato, L. Chan, and M. Rodgers, “A review of wearable sensors and systems with application in rehabilitation,” Journal of NeuroEngineering and Rehabilitation, vol. 9, no. 1, p. 21, Apr. 2012.

[121] S. Balters and M. Steinert, “Capturing emotion reactivity through physiology measurement as a foundation for affective engineering in engineering design science and engineering practices,” Journal of Intelligent Manufacturing, pp. 1–23, Sep. 2015.

[122] E. L. van den Broek, V. Lisý, J. H. Janssen, J. H. D. M. Westerink, M. H. Schut, and K. Tuinenbreijer, “Affective Man-Machine Interface: Unveiling Human Emotions through Biosignals,” in Biomedical Engineering Systems and Technologies, A. Fred, J. Filipe, and H. Gamboa, Eds. Springer Berlin Heidelberg, 2009, pp. 21–47.