Computational Carnatic Music

Yuva Sangeetha Lahari

Traditional painting of Carnatic Trinity

IBM’s Deep Blue defeated world chess champion Garry Kasparov in 1997, and Watson handily won Jeopardy! in 2011. Watson Beat, the company’s newest system, generates Western music scores and inspires musicians with creative ideas. Can the days of auto-generation of Carnatic Music (hereafter referred to as CM) be far away?

Computer scientists at several institutions have been working on simple models for Carnatic music. The areas of interest include raga identification, notation, modeling of gamakas, and rhythm analysis. One could wonder what the purpose is, given that this is art we are talking about, and that the level of sensitivity and skill attained by humans is probably going to be nearly impossible to model. Among the areas where computational modeling could potentially revolutionize the field of CM are tools for efficient transcription, notation software, musicological studies, and perhaps the auto-generation of CM from skeletal notation. Even if we don’t end up producing attractive results in these areas, the deep study required to model the various aspects of CM will at least leave us with a tremendous knowledge base.

Looking at the melodic aspect, the characteristics of CM make exact notation very difficult. Hence any model built to simulate or decode a piece of Carnatic music has to be very complex. Gamakas are an essential characteristic of a raga, without which the raga loses its identity. Because of the nature of these ornamentations in CM, notes sometimes assume slightly different frequencies (not all notes take discrete frequency values). Two ragas that use exactly the same ascending and descending scales, with exactly the same frequency definitions for their notes, can still be very different from each other because of differences in gamakas, the ordering of notes, or the contextual use of certain phrases. Different schools may use slightly different versions of an ornamentation. All these factors, plus the unpredictably fine human elements that make CM what it is, pose immense challenges in forming accurate algorithms for identifying notes, determining the raga, or notating a piece of music. Rhythm analysis needs models that can analyze music at multiple time scales: the tempo, the talam, and the nadais (the sub-beat structure) within a talam.

I’d like to share one example each of computational work in the melodic and rhythmic areas that caught my interest. In his research at the National University of Singapore, Srikumar Karaikudi Subramanian studied the principles underlying the gamakas of CM. The approach was to build a computational model encoding the expertise required to interpret sparse, or “prescriptive,” notation in CM, taking as its basis a reference performance of our beloved Sahana varnam on the veena. The kritis we learn are notated in this prescriptive style: note names indicate which notes of the specified raga are used, along with their timing and the corresponding lyrics underneath. Writing and using such notation assumes the user knows what to do with those notes in the given raga; we normally avoid the extensive symbols needed to spell out the complex movements of notes (descriptive notation), which might render the notation unintelligible. Musicians understand gamakas intuitively, but how do we make a machine learn them? This research builds a model that represents a gamaka in a way intermediate between the prescriptive and descriptive methods. Here is a simple explanation that any Carnatic musician who knows gamakas can understand. The work uses two kinds of representation for a gamaka, choosing between them depending on the context. The first has four components, Pitch, Attack, Sustain, and Release: a pitch value together with parameters describing how it is approached, held, and left (I am not going into too many details here). The second is a two-part representation made up of a Stage (a base pitch, which could differ from the notated pitch) and a Dance (the residual movement after starting at the base pitch). As an example of the second, to play the note Ri in Sahana, one way is to start at Sa and move up, dancing between Ri and Ga.
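
For the computationally inclined reader, the sketch below shows one way these two representations might be written down as data structures. It is only a toy rendering of the idea, with my own field names and pitch conventions, not the encoding used in the dissertation.

    from dataclasses import dataclass
    from typing import List

    # Toy rendering of the two gamaka representations described above.
    # Field names, units, and the example values are illustrative only.

    @dataclass
    class PASR:
        """Four-component form: a pitch and how it is approached,
        held, and left (attack, sustain, release)."""
        pitch: float     # pitch in semitones relative to the tonic (Sa = 0.0)
        attack: float    # relative duration of the glide into the pitch
        sustain: float   # relative duration the pitch is held
        release: float   # relative duration of the glide away from it

    @dataclass
    class StageDance:
        """Two-part form: a base pitch (Stage) plus the residual
        movement made after starting there (Dance)."""
        stage: float         # base pitch; may differ from the notated note
        dance: List[float]   # successive pitch targets of the movement

    # The Sahana example from the text: to realize the notated Ri,
    # start on Sa (the stage) and dance between Ri and Ga.
    ri_in_sahana = StageDance(stage=0.0, dance=[2.0, 4.0, 2.0, 4.0, 2.0])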

We must remember that this research was based on one performer, one composition, and one style, to begin with. The complexity would obviously be much greater if we consider that all of these are variables in CM. It should be noted that the author is a trained Carnatic vainika and puts his veena-based knowledge of gamakas to extensive use in this project.

Meanwhile, at Georgia Tech, a group of four researchers, Ajay Srinivasamurthy, Sidharth Subramanian, Gregoire Tronel, and Parag Chordia, developed a beat-tracking algorithm for describing rhythm in Indian classical music. The tempo, beat locations, and nadai resulting from the analysis are ranked and tested against a manually annotated CM dataset. The algorithm uses a beat similarity matrix and an interval histogram to automatically extract the sub-beat structure and the long-term periodicity of a musical piece. Their method achieved about 75% accuracy, a fairly robust result.
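
To give a flavor of those two ingredients, here is a minimal sketch (in Python, with NumPy) of a beat similarity matrix and an interval histogram. It is a generic illustration of the underlying idea under my own simplifying assumptions, not the authors’ actual algorithm; all names and parameters are my own.

    import numpy as np

    # Generic sketch: a beat similarity matrix plus an interval histogram.

    def beat_similarity_matrix(beat_features: np.ndarray) -> np.ndarray:
        """Cosine self-similarity between per-beat feature vectors
        (shape: n_beats x n_dims). Entry (i, j) says how alike
        beats i and j sound."""
        norms = np.linalg.norm(beat_features, axis=1, keepdims=True)
        unit = beat_features / np.maximum(norms, 1e-9)
        return unit @ unit.T

    def cycle_length(sim: np.ndarray) -> int:
        """Pick the lag (in beats) whose off-diagonal stripe is
        strongest; beats one cycle apart tend to resemble each other,
        so this suggests the long-term periodicity (e.g. a talam cycle)."""
        n = sim.shape[0]
        scores = {lag: np.diagonal(sim, offset=lag).mean()
                  for lag in range(2, n // 2)}
        return max(scores, key=scores.get)

    def interval_histogram(onset_times: np.ndarray, bin_width: float = 0.02):
        """Histogram of gaps between successive onsets; the strongest
        bin hints at the basic pulse, from which sub-beat structure
        (nadai) can be read off as integer subdivisions."""
        gaps = np.diff(np.sort(onset_times))
        bins = np.arange(0.0, gaps.max() + bin_width, bin_width)
        counts, edges = np.histogram(gaps, bins=bins)
        pulse = edges[np.argmax(counts)] + bin_width / 2
        return counts, pulse   # tempo in BPM is roughly 60 / pulse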

Projects such as these are bound to do better when the researcher is equipped with genuine expertise in the intricacies of the music. I hope that the bright young minds active in the Carnatic scene today, many of whom are qualified to pursue computational research, will take up similarly interesting projects and produce meaningful results for musicological studies in CM.

Rajeswari Satish

Bibliography

Chordia, Parag, and Sertan Şentürk. 2013. “Joint Recognition of Raag and Tonic in North Indian Music.” Computer Music Journal 37, no. 3: 82-98.

CompMusic. 2012. “A Two Component Representation for Modeling Gamakas of Carnatic Music.” Published July 26, 2012. YouTube video, 14:41. https://www.youtube.com/watch?v=r7jHZ4rZTMM

CompMusic. 2012. “Generating Computer Music from Skeletal Notation for Carnatic Music Compositions.” Published July 25, 2012. YouTube video, 18:46. https://www.youtube.com/watch?v=w1otlVzoK0E

Daniel, Hannah, and A. Revathi. 2015. “Raga Identification of Carnatic Music Using Iterative Clustering Approach.” In 2015 International Conference on Computing and Communications Technologies (ICCCT), 19-24.

Koduri, Gopala Krishna, et al. 2011. “Computational Approaches for the Understanding of Melody in Carnatic Music.” International Society for Music Information Retrieval (ISMIR), Universitat Pompeu Fabra.

Krishna, T. M., and V. Ishwar. 2012. “Carnatic Music: Svara, Gamaka, Motif and Raga Identity.” In Proceedings of the 2nd CompMusic Workshop, edited by X. Serra, P. Rao, H. Murthy, and B. Bozkurt, 12-18. Istanbul, July 12-13, 2012. Barcelona: Universitat Pompeu Fabra.

Sridhar, Rajeswari, and T. V. Geetha. 2013. “Raga Identification of Carnatic Music Based on the Construction of Raga Model.” International Journal of Signal and Imaging Systems Engineering 6, no. 3: 172.

Sridhar, Rajeswari, et al. 2011. “Latent Dirichlet Allocation Model for Raga Identification of Carnatic Music.” Journal of Computer Science 7, no. 11: 1711.

Srinivasamurthy, Ajay, et al. 2012. “A Beat Tracking Approach to Complete Description of Rhythm in Indian Classical Music.” In Proceedings of the 2nd CompMusic Workshop, Istanbul, July 12-13, 2012.

Subramanian, Srikumar Karaikudi. 2013. “Modeling Gamakas of Carnatic Music as a Synthesizer for Sparse Prescriptive Notation.” PhD dissertation, National University of Singapore.

Vijayakrishnan, K. 2009. “The Function and Scope of Notation in Carnatic Music.” Journal of the Indian Musicological Society 40: 140-272.