
Tag: CSound

Max Mathews and Richard Boulanger Live from 1992

Max Mathews, the father of computer music, worked at Bell Labs beginning in the 1950s. The Radio Drum is a real-time performance instrument he developed over the last 20 years. Composer, performer, and educator Richard Boulanger is the author of the definitive book on Csound. They both play the Radio Drum in concert at Bryan Recital Hall at Bowling Green State University on March 16, 1992. This video is a complete concert, also featuring Burton Beerman (clarinet), Maureen Chowning (soprano), and Celesta Haraszti (dance). [1]



Noli me tangere (Variation 1) by Roberto Doati

In this work the link between video and music is based on the idea of contrast. There are one Tema and four Variations, and the tension and rigidity of the sounds and structures, different in each of the five parts, produce a different perception of the sensuousness, softness, and fluidity of the images: falling silk fabrics.

The video editing follows, with a certain degree of freedom, the pitch series of Anton Webern's Variations op. 30. By chance I had brought 12 fabrics of different colours, and that was enough for me to choose the comfortable guide of musical serialism.
Another important link between video and music is the software environment I used. The EyesWeb program tracks several parameters from the video, such as hue, saturation, brightness, and centre of gravity, and maps them to a Csound patch for real-time signal processing. Only the Tema is composed in a traditional way, i.e. by editing sounds on the video track according to pure musical inspiration. [1]
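The kind of video-to-synthesis mapping described can be sketched in miniature. Purely for illustration, here is a mean-brightness-to-filter-cutoff mapping; the function name, frequency range, and exponential interpolation are all invented, since the piece's actual mappings are not documented here:

```python
def brightness_to_cutoff(pixels, min_hz=200.0, max_hz=8000.0):
    """Map mean frame brightness to a lowpass cutoff frequency.

    pixels: iterable of 0-255 grayscale values for one video frame.
    Returns a cutoff in Hz. The ranges and the exponential mapping
    (equal brightness steps -> equal musical intervals) are assumptions,
    not the actual EyesWeb-to-Csound patch.
    """
    pixels = list(pixels)
    if not pixels:
        return min_hz
    mean = sum(pixels) / len(pixels) / 255.0   # normalize to 0..1
    return min_hz * (max_hz / min_hz) ** mean

# A bright frame pushes the cutoff toward max_hz, a dark one toward min_hz.
```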



Fon by Iván Fernández La Banca

Translated automatically from the original:

Composed entirely in Csound from a single audio file, the work combines resynthesis with psvoc and the granulation technique of Cmask. Granulation normally involves grains understood to be very short (up to about 50 milliseconds), but I take the concept of a grain and extend its duration to around seven seconds. Each grain here is a particular form of the resynthesized voice. [1]



Graph Layout Music by Abram Hindle

Graph layouts use spring layout algorithms. Springs and other physical systems are very interesting and have fundamentally musical properties such as decay and oscillation.
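The decay and oscillation mentioned fall directly out of a damped spring. A minimal sketch follows; all constants are invented for illustration and are not taken from the project, where the oscillation could drive pitch and the exponential decay could drive amplitude:

```python
def damped_spring(x0, k=40.0, damping=0.5, mass=1.0, dt=0.01, steps=500):
    """Simulate a 1-D damped spring and return its displacement over time.

    Semi-implicit Euler integration: acceleration from Hooke's law plus
    viscous damping, then velocity, then position.
    """
    x, v = x0, 0.0
    trajectory = []
    for _ in range(steps):
        a = (-k * x - damping * v) / mass   # spring force + damping
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return trajectory

# Displacements oscillate around zero while their envelope decays,
# which is exactly the "musical" behaviour a sonification can exploit.
traj = damped_spring(1.0)
```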

I took the graph drawer found here and stole the state of the graph every couple of frames: http://js1k.com/demo/618

The instrument I used in the background is an additive string I modified from here: http://csoundblog.com/2010/03/bowed-string-additive-synth/

This is a web-based UI being rendered by Csound: http://csounds.com/

The source code for the whole shebang is at http://github.com/abramhindle/mongrel2-musical-relay http://www.youtube.com/watch?v=JN17PkR63I4



Cloth-based Computer Music

This is such a bizarre, unique little project that I had to post it. The sound in the video is really quiet, so turn your speakers up.

Each point on the cloth is responsible for one sine wave instrument. Its position and movement cause the pitch of that particular instrument to change. The counter at the bottom shows how many xmlHttpRequests have been sent. The cloth demo was stolen from http://js1k.com/demo/434 and http://www.andrew-hoyer.com/experiments/cloth Source code: http://github.com/abramhindle/mongrel2-musical-relay. [1]
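A position-to-pitch mapping for one cloth point might look like the sketch below. The frequency range and the exponential mapping are assumptions for illustration; the project's actual mapping lives in its source code:

```python
def position_to_freq(y, height=480, low_hz=110.0, high_hz=880.0):
    """Map a cloth point's vertical pixel position to a sine frequency.

    y = 0 (top of the canvas) maps to high_hz, y = height to low_hz.
    The mapping is exponential so that equal pixel steps correspond to
    equal musical intervals; all ranges here are invented.
    """
    t = max(0.0, min(1.0, y / height))   # clamp to 0..1
    return low_hz * (high_hz / low_hz) ** (1.0 - t)
```

One such function call per cloth point, refreshed as the simulation runs, would yield the bank of gliding sine waves heard in the video.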



Lissajous Figures 2

This video starts with two sine waves an octave apart (the sideways figure 8). Harmonics are added above the octave, and the phases that result in symmetric figures are shown. First the patterns are shown using sine waves (curved edges), then the same patterns with triangle waves (straight edges). After six individual harmonics are shown, all six harmonics are added at the same time. This is then repeated with triangle waves. [1]
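These figures come from plotting one sine wave against another. Sampling the two waves over one cycle, with a 1:2 frequency ratio (an octave), traces the sideways figure 8 seen at the start of the video:

```python
import math

def lissajous(freq_x=1, freq_y=2, phase=0.0, n=1000):
    """Sample one period of a Lissajous figure.

    x is driven by one sine wave and y by another; freq_x:freq_y = 1:2
    is an octave, giving the sideways figure 8. Changing the phase
    offset rotates the figure through the shapes shown in the video.
    """
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / n
        pts.append((math.sin(freq_x * t), math.sin(freq_y * t + phase)))
    return pts
```

Replacing `math.sin` with a triangle-wave function would give the straight-edged variants of the same patterns.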



It’s a shame that the author of this video failed to include any attribution for this work.

Click for Details by Alessandro Perini

The core of the work is a looped four-channel electronic music track, produced entirely from a single impulse (mathematically a Dirac delta, also called a “click” or “glitch”) as the only source for the whole piece. The click has been processed exclusively with reverberation and filters in Csound. The Csound opcode used to generate the click is mpulse. The constraint of limiting the source material to the extreme minimum calls on the listener to recognize the narrowest semiotic spaces, where the slightest distinctive feature is pertinent. Dealing with the opposition between global and local, the musical development extracts, from time to time, the specific characteristics of each single individual from the overall mass, claiming its right to uniqueness…

From a merely technical point of view, the amplitude values of the four audio channels are used to control in real time the brightness of four clusters of three LEDs each. To do so, a Max/MSP patch converts the amplitude of the sound signal to DMX values and sends them to a USB DMX PRO console via Open Light Architecture. The data are then transmitted to a wireless DMX dimmer which feeds the LEDs. The lamps, pointing at the floor and at the walls, are arranged on the ceiling, and each cluster is situated near one of the four speakers, which also hang from the ceiling, in order to unify the sound and light sources (two subwoofers are placed on the ground, being position-independent). [1]
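The amplitude-to-DMX conversion could be sketched as below. The RMS windowing and linear scaling are assumptions, since the actual Max/MSP patch's scaling is not described; only the 0-255 output range is fixed by the DMX protocol:

```python
import math

def amplitude_to_dmx(samples):
    """Convert a block of audio samples (floats in [-1, 1]) to a DMX level.

    RMS amplitude over the block is mapped linearly onto the 0-255 range
    an 8-bit DMX dimmer channel expects. One such value per audio channel
    would drive one LED cluster, as in the installation described.
    """
    samples = list(samples)
    if not samples:
        return 0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(255, int(round(rms * 255)))
```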



More by Alessandro Perini.

The Melting Sun by Seiya Matsumiya

The Melting Sun is an ambient composition in the Bohlen-Pierce scale, whose tonality, timbre, volume, and timing are determined algorithmically from a video of the sunset.

The sounds heard can be separated into two groups: the drones and the melodies. Both groups feature three different Csound instruments, each corresponding to various types of Red, Green, or Blue values extracted from the video. These data, combined with data gathered from the position of the sun, control various parameters of the composition. Some of the data mapping choices are arbitrary, and some are obvious (e.g. the overall brightness controls the cutoff frequency of the global filter for the drones).

The composition is in the Moll II mode of the Bohlen-Pierce scale. The note numbers used for the drones and the melodies are predetermined, but the base frequency of the scale is not. In fact, the base frequency, or the tonality of the composition, shifts continuously throughout the piece with the sun's position, but the process is too slow to be perceptible, just like the movement of the sun itself. The three melodic instruments actually play the same long loop of notes, but at different timings and in different tritaves. The timing itself changes continuously, and as the sun sinks lower in the sky, creating the illusion that it is gaining speed, the notes are played more frequently. The composition currently uses previously recorded video material, but in the future it will allow the use of a live video feed of the sunset. [1]
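In its equal-tempered form, the Bohlen-Pierce scale divides the tritave (a 3:1 frequency ratio) into 13 equal steps, so any scale degree's frequency follows from the base by the formula below. The base frequency here is an arbitrary placeholder, since in the piece it drifts continuously with the sun's position, and the Moll II mode is a subset of these 13 steps:

```python
def bp_freq(step, base_hz=440.0):
    """Frequency of an equal-tempered Bohlen-Pierce scale degree.

    The BP scale divides the tritave (3:1) into 13 equal steps, so each
    step multiplies the frequency by 3**(1/13). step = 13 lands exactly
    one tritave (three times the frequency) above the base.
    """
    return base_hz * 3.0 ** (step / 13.0)
```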



[1] http://bohlen-pierce-conference.org/compositions/matsumiya-the-melting-sun