Smart human-machine interaction is a global challenge that connects electronics and computer science with other disciplines such as psychology, sociology and the arts. Digital Musical Instruments (DMIs) require the accurate capture of performers' real-time expressive actions and the mapping of these onto sound production parameters. These demands make DMIs excellent test beds for novel human-machine interaction technologies. For instance, musicians increasingly expect to hold an electronic orchestra in their hands and to perform music not only through hearing and seeing but also through touching and feeling. In collaboration with the Department of Music (Dr Richard Polfreman), this project will build a smart musical instrument using microsystems and sensor technologies (such as micro-sensors, embedded systems, wireless communication, 3D printing and wearable electronics) to trial novel human-machine interaction methods.
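The sensor-to-sound mapping mentioned above can be illustrated with a minimal sketch. This is not part of the project specification, only a hypothetical example: a jittery sensor reading (here, a 10-bit ADC value from a hypothetical wearable sensor) is smoothed and then linearly mapped onto a synthesis parameter such as oscillator pitch. All names and ranges are illustrative assumptions.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a raw sensor reading onto a synthesis parameter range."""
    value = min(max(value, in_min), in_max)   # clamp to the valid input range
    span = (value - in_min) / (in_max - in_min)
    return out_min + span * (out_max - out_min)


class SensorSmoother:
    """Exponential moving average to tame sensor jitter before mapping."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha   # smoothing factor: lower = smoother, slower response
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = float(sample)
        else:
            self.state += self.alpha * (sample - self.state)
        return self.state


# Example: a 10-bit ADC reading (0-1023) controls oscillator pitch (110-880 Hz).
smoother = SensorSmoother()
reading = smoother.update(512)
freq = map_range(reading, 0, 1023, 110.0, 880.0)
```

In a real instrument the smoothing factor and parameter ranges would be tuned per sensor and per musical gesture; the one-pole filter shown here is a common, low-latency choice for real-time control signals.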
If you are interested in the project, please contact J.Yan@soton.ac.uk