Week 2 Update: The Beginning of the Beginning.

With last term’s deadlines finally out of the way, this week it was time to move on to the next stage of the MSc course: project development. On Thursday of last week I had an initial discussion with my academic supervisor to get the ball rolling, outlining the basic theory and background of the research I will be conducting. Now it is up to me to go away and do some research! Although I did some research in my previous degree, it was on a smaller scale, and this project will demand a much higher level of project management and personal organisation.

Sonification

In my first supervision we discussed one of the main premises that will underpin the work: sonification. None of the modules I have taken so far on the course has covered sonification in any depth (although some have touched on relevant, transferable topics), so this week has been spent learning more about sonification: what it does, how it works and what its potential uses might be.

Sonification is the practice of displaying data sonically, as an alternative to visualisation, where data is portrayed visually. Kramer [1] defined it more precisely as:

“The use of non-speech audio to convey information. More specifically, sonification is the transformation of data relations into perceived relations in an acoustic signal for the purposes of facilitating communication or interpretation.”

(Kramer, 1993)

Sound can be a genuinely useful way of displaying data, as many of its properties can be varied to represent change (e.g. frequency, amplitude, pitch), helping the listener to perceive trends or correlations in the data. One demonstration of sonification turns a graphical representation of movement data into sound [2].

The motion is tracked using the Jamoma modules in Max/MSP and is then translated into sound. This is an interesting example because we can easily see how the sound corresponds to its visual counterpart. It also has promising applications for people with visual impairments.
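To get a feel for how a parameter mapping like this works, here is a minimal sketch in Python (my own toy illustration, not the method from [2], which uses Max/MSP): each value in a data series controls the pitch and loudness of a short sine tone, and the result is written out as a WAV file.

import wave
import numpy as np

def sonify(data, sr=44100, note_dur=0.2, f_lo=220.0, f_hi=880.0):
    # Scale the series to 0..1 so its minimum maps to f_lo Hz
    # and its maximum to f_hi Hz; loudness follows the same scaling.
    data = np.asarray(data, dtype=float)
    rng = float(data.max() - data.min()) or 1.0
    norm = (data - data.min()) / rng
    t = np.linspace(0.0, note_dur, int(sr * note_dur), endpoint=False)
    tones = []
    for x in norm:
        freq = f_lo + x * (f_hi - f_lo)        # pitch encodes the value
        amp = 0.2 + 0.6 * x                    # loudness reinforces it
        env = np.hanning(t.size)               # fade in/out to avoid clicks
        tones.append(amp * env * np.sin(2.0 * np.pi * freq * t))
    return np.concatenate(tones)

# A rise-and-fall trend you can literally hear going up and then down.
signal = sonify([3, 5, 8, 13, 21, 13, 8, 5, 3])
with wave.open("sonification.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)                          # 16-bit samples
    f.setframerate(44100)
    f.writeframes((signal * 32767).astype(np.int16).tobytes())

Even in a toy mapping like this, redundantly coding the same variable into both pitch and loudness makes the trend noticeably easier to follow by ear.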

One of the areas of sonification research discussed in this preliminary meeting was the sonicules project, which looks at using sonification in chemical drug design.

This is an exciting and novel application of sonification. An enzyme’s binding sites are generally difficult to display visually because of the convoluted structure of the molecule, and sonification can be an important tool for addressing this. Another interesting possibility is rendering sounds in 3D space, conveying spatial information that might be harder to present visually. This is something I may want to pursue in my research, given the knowledge I have acquired about rendering spatial audio using HRTFs (head-related transfer functions), which characterise how an ear receives a sound from a point in space.
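As a rough sketch of how that rendering usually works in practice (the HRIR file names below are hypothetical placeholders standing in for measurements from a real HRTF dataset), binaural audio can be produced by convolving a mono source with the left- and right-ear head-related impulse responses for the desired direction:

import numpy as np
from scipy.signal import fftconvolve

# Hypothetical inputs: a mono signal (noise standing in for a sonified
# data stream) and a measured HRIR pair for one source direction.
mono = 0.1 * np.random.randn(44100)
hrir_left = np.load("hrir_left_az30_el0.npy")    # hypothetical file names
hrir_right = np.load("hrir_right_az30_el0.npy")

# Convolving with each ear's impulse response applies the direction-
# dependent filtering that the head, torso and pinnae would impose.
left = fftconvolve(mono, hrir_left)
right = fftconvolve(mono, hrir_right)
binaural = np.stack([left, right], axis=1)       # stereo, for headphones

Played over headphones, the interaural time and level differences captured in the HRIRs make the source appear to come from the measured direction; moving a data stream around the listener would then amount to switching (or interpolating) between HRIR pairs.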

This then led on to the idea of how one might create interactive user experiences in a sonification system such as the sonicules project. One example is a tangible user interface built using magnetic tags. Creating something similar, but for interacting with a sonification, could be an interesting starting point. Next week I will look at this form of interaction in more depth.

Where to next?

All of this background research has left me with plenty of intriguing ideas to investigate. I will be updating this blog every Friday morning with my new findings and ideas and with how the project is developing, so keep an eye out for that!

Let me know in the comments what your thoughts are on sonification and if you can think of any data that might be useful to sonify. See you next week!

References

[1] G. Kramer, Auditory Display: Sonification, Audification, and Auditory Interfaces. New York, NY: Perseus Publishing, 1993.

[2] A. R. Jensenius. Motion-sound interaction using sonification based on motiongrams. In Proceedings of ACHI 2012: The Fifth International Conference on Advances in Computer-Human Interactions, pages 170–175. IARIA, 2012.
