Communication between Visual Network and Sounds

Tentative possibilities in order of increasing challenge


 


For each possibility below, notes are given on the map (applet) issues, the HTML Form / CGI / Javascript issues, and the Koan sound involved.

1. Music unlinked to data or map movement

a. A single (generative) music file attached to a whole site
- No problem; not required.
- Movement unconditioned by music.

b. A single (generative) music file attached to a static page
- No problem; not required.
- Movement unconditioned by music.

c. A (generative) music file attached to a dynamic page, but selected from a library according to the characteristics of the data on that page
- No problem; but not very interesting if the selection is purely an aesthetic choice (as with silent-movie accompaniment) rather than one directly determined by the data.
- Movement unconditioned by music.

d. (Generative) music files from a library associated dynamically with different schematic features on the page (in our case, hotspots in an applet spring map display of the data). WAV files could be used (149), even one per voice.
- No problem, though less interesting, and back to aesthetic choices.
- Movement unconditioned by music; but the user can modify the music associated with the map by clicking on map nodes to switch to an alternative musical piece.
- Use map nodes as hotspots. Could use the sound files that it is already possible to associate with map nodes.

e. Music adjusted experimentally by user
- Movement unconditioned by music; but the user can modify the music associated with the map by clicking on form buttons.
- To pass parameters in response to user actions, timer events, etc., use JavaScript in the web page, which can be as simple as:

// Set the tempo of the playing piece to 100 BPM
KoanObj.KoanSetPieceParameter(0, "Tempo", "100");
// Set the scale of the first voice in the playing piece to the Major Scale
KoanObj.KoanObjectParameterSet(0, "Voice", 1, "Scale Rule", "Major");

Note that the objects/parameters to be affected, and the values to set, can all be specified by the CGI which creates the user-side JavaScript.
- As above, but subject to user modification (pressing buttons, etc.). Parameters of interest include: Patches, Drums, Voice types (37, 77), Mutation, Phrase (length, gaps: 119), Sustain (Damper/hold: 83, 142), Reverb (91), Chorus (83), Tempo (142), Patch change (113), Muting / Soloing (31, 78), Scale Rule (138).

f. User choice of piece
- As above.
- Use SELECT in a form to choose a musical piece that will start playing when the form is (re)submitted; typically a .skd file.
- The user selects amongst alternative musical pieces to play and is then free to modify them as above.
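A minimal sketch of resolving the SELECT choice to a piece file; the piece list, file names, and basePath are hypothetical, with only the .skd extension taken from the text:

```javascript
// Hypothetical piece library; only the .skd extension comes from the text.
var pieces = [
  { label: "Ambient 1", file: "ambient1.skd" },
  { label: "Ambient 2", file: "ambient2.skd" }
];

// Resolve the SELECT's chosen index to the piece file the form would
// (re)submit for playing.
function pieceUrl(basePath, selectedIndex) {
  return basePath + "/" + pieces[selectedIndex].file;
}
```

On (re)submission the CGI would then regenerate the page with the chosen piece attached.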

2. Music driven by data (but unlinked to any map movement)

a. Simple data patterning to condition music
- Parameters associated with nodes (but not attached to applet data) are used to condition the music.
- Parameters passed as above.
- As above, but such that parameters are specified initially by the CGI to modify the playing of a set musical piece brought up with any map.

b. Data patterns the music
- As above; but the parameters are used as the basis of a pattern for the generated music.
- An algorithm (based on node relationships) is used to define ("seed") Patterns (115) with the FixedPattern Voice type, and the Meter and Mutation parameters (21). Alternative patterns (based on different algorithms) might be accessible from a Pattern List/Group (71) for user selection. Map relationships might be (pre)defined in terms of a particular Follow Strategy (89), allocation of Patches to Voices, Patch change (113), Voice rules (22), or forcing a Voice to play (31).
- As above, but based on a pattern defined by the data.
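One way such a seeding algorithm might look, assuming the "<duration pitch ..." pattern-string form used by the KoanForcePattern examples elsewhere in this document; the mapping from node degree to pitch and duration is purely illustrative:

```javascript
// Sketch: derive a Koan pattern string from node relationships, assuming
// the "<duration pitch ..." form shown in the KoanForcePattern examples.
// Node degree (number of links) picks the pitch; heavily linked nodes are
// given shorter durations. All mappings are illustrative assumptions.
function seedPattern(nodes) {
  var parts = [];
  nodes.forEach(function (node) {
    var duration = node.links > 2 ? 30 : 60; // busier nodes play faster
    var pitch = node.links;                  // degree chosen as pitch value
    parts.push(duration + " " + pitch);
  });
  return "<" + parts.join(" ");
}
```

Different algorithms of this kind would yield the alternative patterns offered for user selection.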

 


3. Map movement driven by music (through common data)

a. Music "plays" the map
- Appropriate applet parameters need to be exposed (to Javascript) such that changes in the music affect the movement of specific parts of the map display. Specific applet nodes need to be "randomized" or differentially moved (as with a mouse drag of them) by a matching musical note.
- This is a matter of defining the appropriate algorithm, determining both the patches to use and the notes to play, which then feeds dynamically into the JavaScript (as illustrated below). Some note (or other) values need to be attached by the CGI as subparameters to the node-specifying parameters in the applet. In effect the musical note substitutes for a mouse drag (and release) of a certain amount.
- Music associated with the page conditions differential movements of the elements of the spring map with which certain values have been associated. The map responds dynamically to the music.
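The note-for-drag substitution can be sketched as below. The pitch-to-displacement scaling and the node representation are assumptions; a real handler would call whatever move method the applet exposes rather than return coordinates:

```javascript
// A note event standing in for a mouse drag of a map node. The scaling of
// pitch to displacement is an illustrative assumption; a real handler would
// call the applet's exposed move method instead of returning coordinates.
function noteToDrag(node, notePitch) {
  var amount = (notePitch - 60) * 2; // distance from middle C scales the drag
  return { x: node.x + amount, y: node.y - amount };
}
```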

 


4. Music driven by map movement (through common data)

a. Data-driven "harp"
- Possibility of having an invisible grid over the applet display, where the grid lines are effectively strings capable of sounding different notes. The elements of the spring map play these strings/notes when they cross them as they continue to move.
- This would be implemented in a similar manner to the above. Here is an example:

// Force voice #2 in the playing file to play a specific note, immediately,
// for a duration of one beat. We do this simply by playing a pattern of
// only one note duration.
lNote = "3"; // the note pitch, assigned from the applet in some manner
KoanObj.KoanFunction("KoanForcePattern", 0, 2, "<60 " + lNote);
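The grid-crossing logic itself might be sketched as follows, assuming vertical grid lines at a fixed spacing; the modulo note assignment per string is an illustrative choice:

```javascript
// Sketch of the invisible "harp" grid: detect when a moving spring-map
// element crosses a vertical grid line, and which note that string sounds.
// The grid spacing and the note assignment are illustrative assumptions.
function crossedString(prevX, newX, spacing) {
  var before = Math.floor(prevX / spacing);
  var after = Math.floor(newX / spacing);
  if (before === after) return null;   // no string crossed on this move
  var line = Math.max(before, after);  // index of the crossed grid line
  return String(line % 12 + 1);        // note pitch for that string
}
```

The returned pitch would then be passed to KoanForcePattern as in the example above.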

b. User modification of the data-driven "harp"
- This would again be achieved similarly to the above. Here is an example that would achieve it:

// Set the instrument for voice #2 to be e.g. a MIDI church organ. Note that
// this could instead use e.g. an MP3 sample, with the appropriate parameter syntax.
lInstrument = "20";
KoanObj.KoanObjectParameterSet(0, "Voice", 2, "Patch", lInstrument);
// ... and now play the required note.
lNote = "3"; // note pitch assigned from the applet in some manner
KoanObj.KoanFunction("KoanForcePattern", 0, 2, "<60 " + lNote);

- As above, where the user can change the notes or instruments associated with particular grid strings, and the separation of the strings in the grid.

5. Map movement driven interactively by music (and vice versa)

a. Differential movement driven by polling exposed applet subparameters modified by music
- A JavaScript timer can be used to trigger the notes to play in a sequence, but this solution is quite complicated and not very elegant. It is, however, the method Koan have used to implement their Freeharps. They intend to address this in the next few weeks with a call similar to the following:

// Force voice #2 in the playing file to play a specified note sequence,
// immediately. NB: the pattern string is in Koan Pattern parameter format,
// and defines both note durations and pitches.
KoanObj.KoanFunction("KoanForcePattern", 0, 2, "<60 1 60 2 60 3 30 4 30 5");

- Parameters associated with hotspots in a data display that can trigger the playing of different sound sequences, where the parameters are defined in the CGI script by the characteristics of the data.
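The interim timer-driven approach can be sketched by separating the per-tick step from the timer itself. KoanObj is stubbed here for illustration, and the note values are arbitrary:

```javascript
// Stub standing in for the Koan plugin object, recording forced patterns.
var KoanObj = {
  played: [],
  KoanFunction: function (name, piece, voice, pattern) {
    this.played.push(pattern);
  }
};

// Returns a tick function that forces one note of the sequence per call,
// using the "<duration pitch" single-note pattern form shown above.
function makeSequencePlayer(notes) {
  var i = 0;
  return function tick() {
    if (i >= notes.length) return false; // sequence finished
    KoanObj.KoanFunction("KoanForcePattern", 0, 2, "<60 " + notes[i]);
    i += 1;
    return true;
  };
}

// In the page this would be driven by a timer, e.g.:
//   var tick = makeSequencePlayer(["1", "2", "3"]);
//   var timer = setInterval(function () { if (!tick()) clearInterval(timer); }, 250);
var tick = makeSequencePlayer(["1", "2", "3"]);
while (tick()) {}
```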

6. Map movement driven via user (micro) input

a. "Whistling to the map"
- As above, except that the user plays external music which conditions the movement (perhaps using something like the Koan freemixer), or even provides input via a microphone.
- Again, now that the underlying mechanism has been illustrated, this is a matter of determining an appropriate algorithm, which you then use to drive the Koan API appropriately.

b. "Musical command language"
- Map extension / reduction through musical commands substituting for current applet keystroke commands.

7. Strategic coordination (possibilities to be explored)

a. "Chladni effects"

b. "Bingo control"

c. "Harmonization"
- Use of alternative rules of harmony to explore the challenges of coordination.

d. "Circular score"