Traditional implementations of sound and music in interactive contexts have their limitations. One way to overcome these limitations and to expand the possibilities of music is to handle the music in a parameterised form. To better understand the properties of the musical parameters resulting from parameterisation, two experiments were carried out. The first experiment investigated selected parameters’ capability to change the music; the second experiment examined how the parameters can contribute to expressing emotions. From these experiments, it is concluded that users without musical training perform differently from musicians on some of the parameters. There is also a clear association between the parameters and the expressed basic emotions. The paper concludes with observations on how parameterisation might be used in interactive applications.