Research, Sonification, User Interfaces

Week 3 Update: I see what you mean / I hear what you say

This week I will be discussing the main themes of what I have been researching, in an attempt to narrow down my search. Once I have done this, I will hopefully be able to refine my title and conduct a more comprehensive search of the plethora of research papers out there on sonification.

[Image: some light bedtime reading]

Sonification as a data display tool

Why is it that our culture is so image-obsessed? (I’m not talking about what you might think I’m talking about here).

Imagine you’re trying to explain a new concept to someone who knows little or nothing about it. When they finally get what you’re talking about (depending on how good you are at explaining things), they might say something along the lines of:

“Ooh.. I see what you mean now.”

But why should they see what you mean, rather than hear what you say? When discussing new concepts, our initial ideas are usually visualised first. Some other common visualising phrases that have infiltrated our modern vernacular include:

“Let me look into that for you”

“Our company image reflects that of…”

“I have a vision for this project about sonification”

But what if there was another way to think about things? Or, more importantly for this research, what if there was a way to hear things?

Sonification has been around for a while now, and there have been plenty of useful applications of it. Here is one very interesting TED talk, by Lily Asquith, about the use of sonification in particle physics with data from the Large Hadron Collider:

This is just one of the many ways in which we might be able to use sonification to enrich our lives and aid our research. In this talk, Asquith’s sonification uses a technique known as parameter mapping [1]. However, I would like to research areas of sonification which allow the user more interaction with the data than simply listening to it. One such area is interactive sonification [1], where the user is able to interact with the data directly; this draws on research from both human-computer interaction and auditory data analysis. Specifically, I would like to look at the use of interfaces in such a system.
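To make the idea of parameter mapping a little more concrete, here is a minimal Python sketch of my own (the dataset and the frequency range are invented for illustration, and this is not Asquith’s actual method): each data value is mapped linearly onto pitch and rendered as a short sine tone in a WAV file.

```python
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 44100   # samples per second
TONE_LENGTH = 0.2     # seconds of audio per data point

def map_to_pitch(data, low_hz=220.0, high_hz=880.0):
    """Parameter mapping: scale each data value linearly onto a pitch range."""
    data = np.asarray(data, dtype=float)
    span = data.max() - data.min()
    norm = (data - data.min()) / span if span > 0 else np.zeros_like(data)
    return low_hz + norm * (high_hz - low_hz)

def sonify(data, filename="sonification.wav"):
    """Render one short sine tone per data point and write the result out."""
    t = np.linspace(0.0, TONE_LENGTH, int(SAMPLE_RATE * TONE_LENGTH), endpoint=False)
    tones = [np.sin(2 * np.pi * freq * t) for freq in map_to_pitch(data)]
    signal = np.concatenate(tones)
    wavfile.write(filename, SAMPLE_RATE, (signal * 32767).astype(np.int16))

# An invented dataset -- e.g. six sensor readings, heard as rising and falling pitch
sonify([3.1, 4.7, 2.2, 9.8, 5.5, 7.1])
```

Even in something this simple you can hear the shape of the data, but notice that the listener is entirely passive, which is exactly the limitation interactive sonification tries to address.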

Tangible Audio Interfaces

This week I’ve been looking at some novel interfaces for musical exploration, in an attempt to consider how to improve user interaction in interactive sonification. One particularly interesting paper describes a tangible user interface called Musical Trinkets [2], which uses passive resonant, electromagnetically-coupled tags to turn various odd bits and pieces into a musical controller.

[Image: Paradiso and Hsiao’s Musical Trinkets, http://resenv.media.mit.edu/sweptRF.html]

This could have interesting applications to a sonification system, where each “trinket” could somehow be related to a parameter in a dataset, or could be used to further explore data through the tactile interface, as sketched below. Over the next few weeks I will be exploring tangible audio interfaces further.
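As a thought experiment, here is a hypothetical Python sketch of that mapping (the tag IDs, parameter names and the shape of the tag readings are all invented; the real Musical Trinkets system reads its tags via a swept-RF reader): the presence and proximity of each trinket selects and weights a parameter of the dataset being sonified.

```python
# Hypothetical mapping from RF tag IDs to columns of a dataset.
TAG_TO_PARAMETER = {
    0x01: "temperature",
    0x02: "humidity",
    0x03: "pressure",
}

def active_parameters(tag_readings):
    """Turn raw tag readings into (parameter, weight) pairs.

    `tag_readings` is assumed to be a dict of {tag_id: proximity}, where
    proximity runs from 0.0 (out of range) to 1.0 (touching the reader coil).
    """
    return {
        TAG_TO_PARAMETER[tag_id]: proximity
        for tag_id, proximity in tag_readings.items()
        if tag_id in TAG_TO_PARAMETER and proximity > 0.0
    }

# Example: trinket 0x01 is close to the reader and 0x03 is further away,
# so temperature would dominate the resulting sonification mix.
print(active_parameters({0x01: 0.9, 0x03: 0.3}))
# -> {'temperature': 0.9, 'pressure': 0.3}
```

The appeal of this design is that the user’s physical gesture (picking up, approaching, tilting a trinket) becomes the query over the data, rather than a menu or a slider.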

Spatial Audio

Another interesting concept for sonification is the use of spatial audio. Both spatial and non-spatial datasets can be sonified using either spatial or non-spatial audio [3].

Head-Related Transfer Functions (HRTFs), which encode interaural time and level differences (ITD and ILD), can be used to give a sound both an azimuth and an elevation, as seen below:

[Figure: azimuth and elevation of a sound source relative to the listener’s head]
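To get a feel for the size of the ITD cue, here is a small Python sketch using Woodworth’s classic spherical-head approximation (the head radius and speed of sound are standard textbook values; real HRTFs measure these effects per listener rather than computing them from a formula):

```python
import math

HEAD_RADIUS = 0.0875     # metres, typical adult head
SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C

def itd_woodworth(azimuth_deg):
    """Interaural time difference via Woodworth's spherical-head model."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)

for az in (0, 30, 60, 90):
    print(f"azimuth {az:3d} deg  ITD = {itd_woodworth(az) * 1e6:6.1f} microseconds")

# A source at 90 degrees (directly to one side) arrives roughly 0.65 ms
# earlier at the nearer ear -- one of the cues an HRTF encodes alongside
# level differences and spectral colouration.
```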

 

Spatial sound can be a useful tool for sonification, as it allows us to explore data in new and exciting ways. This work could also tie together the concepts of spatial audio and user interaction/interfacing.

All of this has left me with a lot of open doors, so join me again next Friday morning, when my task will be to hone the project further and get a bit more specific, with a view to creating a title for it.

Let me know in the comments what you think of this week’s work! See you next week, when I will be looking further into interactive sonification and user interaction.

References 

[1] T. Hermann and A. Hunt, “Guest editors’ introduction: An introduction to interactive sonification,” IEEE MultiMedia, vol. 12, no. 2, pp. 20–24, 2005.

[2] J. A. Paradiso et al., “Tangible music interfaces using passive magnetic tags,” in Proceedings of the 2001 Conference on New Interfaces for Musical Expression (NIME), National University of Singapore, 2001.

[3] T. Nasir and J. C. Roberts, “Sonification of spatial data,” in Proceedings of the International Conference on Auditory Display (ICAD), Georgia Institute of Technology, 2007.

 
