With the spread of online music distribution and portable music players, we now have many more opportunities to access music on computers. This page introduces our studies related to music.


We use pictures and music in our daily lives as the situation demands. For example, on a mobile phone you may set a different image or ringtone for calls from a particular person.
We propose DIVA (Digital Image Varies Arrangement), an automatic arrangement system that combines pictures and melodies appropriately by connecting a melody to a picture with a similar impression. DIVA automatically composes a piece of music from rhythmic patterns selected according to the color distribution of an image and the keywords attached to it, so it can provide a musical arrangement to go with any picture.
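As a rough illustration of the idea of matching a rhythmic pattern to an image's color distribution, the following sketch picks a pattern from the average hue of an image. This is not DIVA's actual algorithm; the hue ranges and pattern names are invented for the example.

```python
# Minimal sketch (not DIVA itself): choose a rhythmic pattern from the
# dominant hue of an image, as one way to connect the impression of a
# picture to a musical arrangement. Hue ranges and pattern names are
# illustrative assumptions.

def dominant_hue(pixels):
    """Average hue (0-359) over a list of (hue, sat, val) pixels."""
    return sum(h for h, _, _ in pixels) // len(pixels)

def select_rhythm(pixels):
    hue = dominant_hue(pixels)
    if hue < 60:          # warm reds/oranges -> energetic pattern
        return "upbeat-8th"
    elif hue < 180:       # greens -> relaxed pattern
        return "bossa"
    elif hue < 300:       # blues/purples -> calm pattern
        return "ballad"
    return "upbeat-8th"

# Example: a mostly blue image picks a calm pattern.
pixels = [(220, 80, 70), (230, 75, 65), (210, 85, 60)]
print(select_rhythm(pixels))   # -> ballad
```

A real system would of course also use the attached keywords and a far richer feature set; the point here is only the mapping from image statistics to a musical choice.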


As devices such as portable MP3 players let us carry massive amounts of music data, our selections of music have diversified. There is still plenty of room for creativity in developing a display list, a so-called "content browser," that shows what kind of music you are carrying.
The diagram shows an example of a content browser that presents a list of music data as a set of icons stacked and displayed in 3D. The browser uses a visualization method called "HeiankyoView," extended to 3D.
A problem with existing 3D content browsers is that the icons in the back are hidden behind the ones in front, a phenomenon called "cluttering." To avoid cluttering, we developed a method that lays out the icons so as to eliminate overlaps between them.
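The overlap-free constraint can be illustrated with a much simpler layout than HeiankyoView: the sketch below greedily places square icons left to right on "shelves" so that no two icons overlap. HeiankyoView packs nested rectangles far more compactly; this only demonstrates the no-overlap property.

```python
# Minimal sketch of overlap-free icon layout (not HeiankyoView itself):
# greedy shelf packing of equal-sized square icons.

def layout_icons(n, size, row_width):
    """Place n size-by-size icons in rows of width row_width."""
    positions = []
    x = y = 0
    for _ in range(n):
        if x + size > row_width:   # start a new shelf when the row is full
            x, y = 0, y + size
        positions.append((x, y))
        x += size
    return positions

def overlaps(a, b, size):
    ax, ay = a
    bx, by = b
    return abs(ax - bx) < size and abs(ay - by) < size

pos = layout_icons(7, 10, 35)
# No pair of placed icons overlaps.
assert not any(overlaps(p, q, 10) for i, p in enumerate(pos) for q in pos[i + 1:])
print(pos)
```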


Icons of music files show only the file extension; they express nothing about the genre or impression of the music. We thought it would be much more fun if music files were listed with different icons according to their impressions.
We propose MIST (Music Icon Selector Technique), an automatic selection system that chooses an icon image based on the characteristics of music and pictures. MIST extracts features from both the music and the pictures and expresses each impression in adjectives. It then matches music and pictures that are considered to have similar impressions and automatically selects the icon that suits the music.
We have also proposed "Lyricon," a system that represents the content or story of popular music with several icon images based on the lyrics.
Lyricon divides the lyrics into blocks, applies morphological analysis, and chooses an icon for the word that best expresses each block; the combination of selected icons expresses the content of the song.
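The block-to-icon step can be sketched as follows. Real Lyricon uses morphological analysis (needed, for example, for Japanese lyrics); here whitespace tokenization of English text stands in for it, and the word-to-icon table is a hypothetical example.

```python
from collections import Counter

# Minimal sketch of Lyricon's pipeline under simplifying assumptions:
# tokenize each lyric block, find its most frequent content word, and
# map that word to an icon. ICON_TABLE and STOPWORDS are invented.

ICON_TABLE = {"rain": "umbrella.png", "love": "heart.png", "night": "moon.png"}
STOPWORDS = {"the", "in", "a", "i", "you", "and", "is"}

def block_icon(block):
    words = [w.lower().strip(",.") for w in block.split()]
    counts = Counter(w for w in words if w not in STOPWORDS)
    for word, _ in counts.most_common():
        if word in ICON_TABLE:        # best-expressed word that has an icon
            return ICON_TABLE[word]
    return "note.png"                 # fallback icon

lyrics_blocks = ["rain rain in the night", "i love you and love the rain"]
print([block_icon(b) for b in lyrics_blocks])   # -> ['umbrella.png', 'heart.png']
```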


Beyond icons, there are other ways to express music visually. We propose "MusCat," a system that generates abstract images based on the feature values of music.
MusCat applies clustering to many pieces of music and automatically generates an abstract image for each cluster based on its feature values.
The abstract images are then displayed as a list in our zoomable image-viewing system "CAT," and zooming in on one of the images lets the user select the desired music.
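The cluster-then-depict idea can be sketched with toy data. This is not MusCat's actual algorithm: the feature axes (tempo, brightness), the fixed seed centroids, and the "abstract image" (reduced here to a single gray level per cluster) are all illustrative assumptions.

```python
# Minimal sketch of MusCat's idea: cluster songs by a feature vector and
# derive one abstract "image" per cluster. Features and the gray-level
# rendering are invented for illustration.

def assign(song, centroids):
    """Index of the nearest centroid by squared Euclidean distance."""
    return min(range(len(centroids)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(song, centroids[i])))

songs = [(60, 0.2), (65, 0.3), (140, 0.8), (150, 0.9)]   # (tempo, brightness)
centroids = [(60, 0.2), (150, 0.9)]                       # fixed seeds for brevity

clusters = {}
for s in songs:
    clusters.setdefault(assign(s, centroids), []).append(s)

# "Abstract image" per cluster: mean brightness mapped to a gray level.
for cid, members in sorted(clusters.items()):
    gray = int(255 * sum(b for _, b in members) / len(members))
    print(cid, gray)
```

A full implementation would iterate the centroid update (k-means) and render a richer image than one gray value, but the grouping step is the same.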


The objectives of music information visualization include visualizing the internal structure of a single piece of music, in addition to listing many pieces. Such visualization is valuable for assisting composition and arrangement, practicing musical instruments and conducting, and educating beginners.
We have proposed "Colorscore," a system that visualizes and summarizes a full score. Colorscore extracts compositional patterns from the score and shows them in different colors, visualizing the internal structure of the music. Colorscore can also summarize the music by compressing the visualized result in both the horizontal and vertical directions.
In addition, we have another system, "RoughNote," which accepts CD and MP3 audio input, whereas Colorscore uses MIDI files. In RoughNote, the characteristics of a piece of music are depicted as colored spheres, and changes in the music are "roughly" expressed by the sequence of spheres.
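The feature-to-sphere-color step can be sketched as a simple mapping from per-segment audio features to RGB values, so that changes over time show up as a color sequence. The particular mapping below is an illustrative assumption, not RoughNote's actual one.

```python
# Minimal sketch of RoughNote's representation idea: each audio segment
# gets a sphere whose color is derived from its features. The mapping
# (loudness -> red, spectral brightness -> blue) is invented.

def sphere_color(loudness, brightness):
    """loudness, brightness in [0, 1] -> (r, g, b) in 0..255."""
    r = int(255 * loudness)                     # louder -> redder
    b = int(255 * brightness)                   # brighter timbre -> bluer
    g = int(255 * (1 - abs(loudness - brightness)))
    return (r, g, b)

segments = [(0.2, 0.3), (0.8, 0.7), (0.5, 0.5)]  # (loudness, brightness)
print([sphere_color(l, b) for l, b in segments])
```

Reading the resulting color sequence left to right gives the "rough" view of how the piece changes, without showing any note-level detail.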


When people choose a piece of music, they often want to select one for a situation, such as "music for driving" or "music at the seaside," rather than specifying the exact names of musicians or tunes.
In such situations the selection criteria can be highly subjective and differ greatly between individuals. To satisfy these needs, we propose "MusiCube," a music recommendation system in which the computer learns users' preferences. MusiCube, which applies Interactive Evolutionary Computation, presents pieces of music to users, who decide whether the given tunes match their purposes. By repeating this operation, MusiCube learns the users' preferences, and users can check the system's learning process and the distribution of the presented tunes through icons laid out in a square space whose X and Y axes are assigned to two features. MusiCube is thus not only a music recommendation system but also one that makes users aware of the characteristics of the tunes they select.
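The interactive evolutionary loop behind a system like MusiCube can be sketched as follows. Candidates are points in a 2D feature space; the user's yes/no judgments select survivors, and survivors are mutated to produce the next generation. Everything here is an illustrative assumption, including the simulated "user," which in the real system is a person clicking on tunes.

```python
import random

# Minimal sketch of an Interactive Evolutionary Computation loop (not
# MusiCube's actual implementation). TARGET stands in for a hidden
# user preference; user_likes simulates the user's yes/no decision.

random.seed(0)
TARGET = (0.8, 0.2)            # hypothetical preference (e.g. fast, calm)

def user_likes(tune):
    return (tune[0] - TARGET[0]) ** 2 + (tune[1] - TARGET[1]) ** 2 < 0.1

def evolve(population, generations=20):
    for _ in range(generations):
        # Keep candidates the user liked (or everything, early on) ...
        liked = [t for t in population if user_likes(t)] or population
        # ... and mutate them slightly to form the next generation.
        population = [(min(1, max(0, x + random.uniform(-0.1, 0.1))),
                       min(1, max(0, y + random.uniform(-0.1, 0.1))))
                      for x, y in (random.choice(liked) for _ in range(8))]
    return population

pop = [(random.random(), random.random()) for _ in range(8)]
final = evolve(pop)
print(sum(user_likes(t) for t in final), "of", len(final), "match the preference")
```

In MusiCube the same 2D feature space doubles as the display: the icons the user sees drift toward the preferred region as the generations proceed.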


It is worth categorizing massive music data according to the musical content itself.
We have categorized many pieces of popular music by chord progression and visualized the relationship between the categorization and other related information such as artist names, dates, and listener demographics. This makes it possible to analyze music in a simple manner, for example the preferences of a certain listener or the popular chord progressions of a particular period.
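The grouping step can be sketched with toy data. The songs, eras, and progressions below are invented for illustration; the real study links the resulting categories to artists, dates, and listener demographics.

```python
from collections import defaultdict

# Minimal sketch of categorizing songs by chord progression: songs with
# an identical progression fall into the same category. All data here
# is invented for illustration.

songs = [
    ("Song A", "1960s", ["C", "G", "Am", "F"]),
    ("Song B", "1990s", ["C", "G", "Am", "F"]),
    ("Song C", "1960s", ["C", "Am", "F", "G"]),
]

by_progression = defaultdict(list)
for title, era, chords in songs:
    by_progression["-".join(chords)].append((title, era))

for prog, members in sorted(by_progression.items()):
    print(prog, "->", members)
```

Cross-tabulating these categories against the era field (or artist, or listener data) then yields analyses such as which progressions were popular in which period.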
This study is joint research with the Watanabe laboratory at Ochanomizu University.