Visualisation for Articulation for the Hearing Impaired with Self Organising Maps
This MSc project focuses on place of articulation for a hearing-impaired person. Most of the work concerns building an efficient and attractive user interface for exploring the use of Kohonen's Self-Organising Maps in visualising phonemic trajectories. Several experiments were carried out with a hearing-impaired person and with normal speakers, and the results were analysed and visualised with the VAHISOM program.
In parallel with the project's main objectives, a user of the software can also explore Kohonen's Self-Organising Maps and assess how well they fit similar problems. This also makes VAHISOM a useful tutorial aid.
The VAHISOM program is a graphical user interface for the SOM_PAK software. It is implemented in the C programming language using the libsx graphics library, and it runs on Unix systems. The buttons and the layout of the display area let the user test an input on the map efficiently and visualise the clustering effectively. The program automatically produces the classification results in a detailed form, from which the user can draw important conclusions.
The primary objective of the project was to visualise and compare the speech trajectories of hearing-impaired and normal speakers.