Category Archives: misc

Weaving code: learning computer programming through pattern and craft

I’m leading a new collaborative project, “Weaving code: learning computer programming through pattern and craft”, with Becky Parry, Kia Ng, and the good folks from LoveBytes and ArtBoat. Ellen Harlizius-Klück and Dave Griffiths are advising as project partners, and Chris Carr will also advise on progress. Here’s the introduction from the proposal:

There is a national policy drive to teach computer programming in schools. However, there is a disconnect between programming and socially situated learning through play. Our research will bridge this gap, recognising the need of people, particularly children, to engage with the social and tangible in order to understand the abstract. Our core aim is to bring together pattern making in weaving and pattern making in the live coding of music, in a pedagogic context. This will ground abstract thinking in social activities, as a springboard for learning. We will reconnect computer programming with its origins in craft, drawing on the inspiration which Babbage and Lovelace took from the Jacquard loom, as well as the development of formal mathematics in Greek antiquity using loom metaphors.

Our first step will be a visit to Masson Mills working textile museum, which should be an inspiring trip. This ignite funding came through the Cultural and Creative Industries Exchange at the University of Leeds, and will hopefully feed into bigger things.

Extending human ability

I don’t always enjoy praise, but it’s really great when commentators see through the cold reality of live coding or algorave and get at the promise that motivates what we’re doing.

Here’s a comment by a reddit user called Tekmo from a few months back, which I think is about the promise of a more embodied approach to the practice of programming:

I think the entire premise of this project is really brilliant. Right now it’s probably not immediately inspiring because it takes a minute or so to switch between patterns for an average user, but imagine somebody getting REALLY good at improving on this, with their own custom library of one or two-letter function names and performing by constantly improvising patterns every few seconds while programming at lightning speed.

But the real reason I think this is brilliant is because this is sort of what I always imagined programming was about: extending human ability. I feel like the super-heroes of the future will be programmers that command an impressive array of remote machinery as if it were an extension of their own body.

Here’s an excerpt from a nice blog post by DuBose Cole which to me hints at a cultural tipping point when more people start programming:

Events like Algorave highlight that by making more people creators through programming, we don’t just get new technical creations, but social and cultural ones as well. Algorave features electronic music created by algorithms programmed on the fly for a crowd. Revellers seem to attend due to either an interest in how the music is created, a particular love of electronic music, or just to have a party. An idea like Algorave takes the image of coding as a solitary experience and moves it forward, making the programmer a collaborative and immediate creator, as well as a bit of a rock star.

What the idea highlights however, is that learning to create with code is less about the skill itself and more about what you do with it. Pushing coding literacy is only the beginning. Coders are creating an ever expanding culture of creation, which anyone with a basic appreciation or skill for programming can join in with. The increasing simplicity with which people can learn coding has not only changed who can create, but also the scope of what’s being created.

Brilliant stuff.

Texture version 2.0 pre-alpha

During my residency period, I’m rewriting “Texture”, the visual front-end for Tidal that I started making way back in the closing moments of my PhD. The first step is to re-implement Texture in Haskell; previously it was written in C and spat out code that was piped into the Haskell interpreter, which was a bit nuts. I’m taking a bricolage approach so don’t have a clear plan, but a rudimentary interface is starting to work:

As before, the idea is that values are applied to the closest function with a compatible type signature. I’ve still had to ‘reimplement’ part of the Haskell type system in Haskell itself. While I can get Haskell to tell me whether a value could be type-compatible with a function, that turns out not to be enough: in practice almost everything is type-compatible, and the real constraints come from whether the required type class instances are present. Or something like that.
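To illustrate the distinction with a standalone example (nothing to do with Texture’s internals): the argument type of show unifies with almost anything, but the call only compiles when a Show instance exists.

-- Standalone illustration: unification succeeds, instance resolution decides.
-- 'show' has type Show a => a -> String, so its argument type unifies with
-- nearly anything; whether the call compiles depends on the instance.

ok :: String
ok = show (3 :: Int)     -- compiles: there is a Show instance for Int

-- notOk :: String
-- notOk = show (+ 1)    -- rejected: no Show instance for functions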

My next step is where the real point of this rewriting exercise comes in: visualisation of patterns as they are passed through a tree of transformations. I’m not sure exactly how this is going to look, but because this is all about visualising higher order functions of time rather than streams of data, it’s going to be quite different from dataflow; it will be able to include past and future values in the visualisation without any buffering.
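To give a flavour of why that’s possible, here’s a minimal sketch of the underlying idea (not Tidal’s actual definitions): if a pattern is just a function from a time span to the events within it, then querying the future is no different from querying the past.

-- A minimal sketch of a pattern as a function of time, not Tidal's real types.
type Time = Rational
type Span = (Time, Time)          -- start and end of a query window, in cycles
type Event a = (Span, a)          -- where an event happens, and its value

newtype Pattern a = Pattern { query :: Span -> [Event a] }

-- A pattern repeating one value every cycle.
cycled :: a -> Pattern a
cycled x = Pattern $ \(s, e) ->
  [ ((fromIntegral c, fromIntegral c + 1), x)
  | c <- [floor s .. ceiling e - 1] ]

-- Past and future are equally queryable; no stream needs to be buffered.
upcoming :: [Event String]
upcoming = query (cycled "bd") (100, 102)   -- the events in cycles 100 and 101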

The (currently useless) code is available here, under the GPLv3 license.

Workshop: Drawing, Weaving, and Speaking Live Generative Music

Some more details about my upcoming workshop at Hangar, Barcelona. Sign up here.

This workshop will explore alternative strategies for creating live sound and music. We will make connections between generative code and our perception of music, using metaphors of speech, knitting and shape, and playing with code as material. We will take a fresh look at generative systems, not through formal understanding but just by trying things out.
Over the sessions, we will work up through the layers of generative code. We will start with a sideways look at symbols, inventing alphabets and drawing sounds. We will string symbols together into words, exploring their musical properties and how they can be interpreted by computers. We will weave words into the patterns of language, as live generation and transformation of musical patterns. We will learn how generative code is like musical notation, and how live coding environments can be made more like graphical scores.

We will visit systems such as Python, SuperCollider, Haskell, openFrameworks, Processing and OpenCV, and also experiment with more esoteric interfaces.

Schedule:

Session #01
Symbols – This first session will deal with topics such as sound symbology, mental imagery, perception and invented alphabets. We will try out different ways to draw sounds, map properties of shape to properties of sound using computer vision (“acid sketching”, https://vimeo.com/7492566; a toy version of such a mapping is sketched below), and draw lines through a sound space created from microphone input. This will allow us to get a feel for the real difference between analogue and digital, how they support each other, and how they relate to human perception and generative music.
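To give a flavour of what a shape-to-sound mapping can look like, here’s a toy sketch in Haskell; the particular properties and numbers are invented for illustration, not the mapping we’ll actually use in the session.

-- A toy shape-to-sound mapping; properties and ranges invented for illustration.
data Shape = Shape
  { area      :: Double   -- size of the detected blob, in pixels
  , roundness :: Double   -- 0 = spiky outline, 1 = perfect circle
  , height    :: Double   -- vertical position in the frame, 0 (bottom) to 1 (top)
  }

data Sound = Sound
  { amplitude  :: Double  -- 0 to 1
  , brightness :: Double  -- 0 = dull, 1 = bright
  , pitch      :: Double  -- in Hz
  }

shapeToSound :: Shape -> Sound
shapeToSound s = Sound
  { amplitude  = min 1 (area s / 10000)  -- bigger shapes are louder, capped at 1
  , brightness = roundness s             -- rounder shapes sound smoother
  , pitch      = 110 + 770 * height s    -- higher on screen means higher pitch
  }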

Session #02
Words – We will talk more about strings of symbols as words, as articulations or movements, and relate expression in speech (prosody) to expression in generative music. We will experiment with stringing sequences of drawn sounds together, inventing new “onomatopoeic” words. We will look at examples of musical traditions which relate words with sounds (such as the ancient Scottish Canntaireachd, chanting the bagpipes), and also try out vocable synthesis (http://slub.org/world or http://oldproject.arnolfini.org.uk/projects/2008/babble/), which works like speech synthesis but uses words to describe articulations of a musical instrument; the idea is sketched below.
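As a rough illustration of the principle (a toy sketch, not the actual vocable synthesis engine), each letter of a word can be read as an articulation of an imagined instrument:

-- A toy sketch of the vocable idea, not the real engine: letters are read as
-- articulations of an imagined instrument.
data Articulation = Articulation { attack :: Double, timbre :: Double }
  deriving Show

letterToArticulation :: Char -> Maybe Articulation
letterToArticulation c = case c of
  'b' -> Just (Articulation 0.9 0.2)   -- hard attack, dull tone
  't' -> Just (Articulation 0.8 0.8)   -- hard attack, bright tone
  'a' -> Just (Articulation 0.1 0.5)   -- soft, open
  'o' -> Just (Articulation 0.1 0.3)   -- soft, rounded
  _   -> Nothing                       -- unknown letters are silent

-- "bato" becomes a gesture of four articulations.
wordToGesture :: String -> [Articulation]
wordToGesture = concatMap (maybe [] pure . letterToArticulation)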

Session #03
Language – This session will explore the historical and metaphorical connections between knitting and computation, and between code and pattern. After some in-depth talk about live coding and the problems and opportunities it presents, we’ll spend some time exploring Tidal, a simple live coding language for musical pattern, and come to understand it using the metaphor of knitting with time; a couple of one-liners follow below as a taste.
Tidal: http://yaxu.org/demonstrating-tidal/
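As a taste, here are two one-liners of the kind we’ll type into a running Tidal session; the second transforms the first by reversing it every third cycle.

-- Typed into a running Tidal session: a kick/snare loop...
d1 $ sound "bd sn"

-- ...and the same loop, reversed every third cycle.
d1 $ every 3 rev $ sound "bd sn"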

Session #04
Notation – Here we will look at the relationship between language and shape, and at a range of visual programming languages. We will try out Texture, a visual front-end for Tidal, along with some ways of controlling it with computer vision that create feedback loops through body and code.
Texture: http://yaxu.org/category/texture/

Session #05
Final presentation and workshop wrap up.

Level: Introductory/intermediate. Prior programming experience is not required, but participants will need to bring a laptop (preferably a PC, or a Mac able to boot off a DVD), an external webcam and a pair of headphones.

Language: English

Tutor: Alex McLean

Alex McLean is a live coder, software artist and researcher based in Sheffield, UK. He is one third of the live coding group Slub, getting crowds to dance to algorithms at festivals across Europe. He promotes anthropocentric technology as co-founder of the ChordPunch record label, the Algorave event series, the TOPLAP live coding network, and the Dorkbot electronic art meetings in Sheffield and London. Alex is a research fellow in Human/Technology Interface within the Interdisciplinary Centre for Scientific Research in Music, University of Leeds.

http://yaxu.org/
http://slub.org/
http://algorave.com/
http://chordpunch.com/
http://toplap.org/
http://icsrim.org.uk/
http://music.leeds.ac.uk/people/alex-mclean/

Dates:
Tuesday 23.07.2013, 17:00-21:00h
Thursday 25.07.2013, 17:00-21:00h
Saturday 27.07.2013, 12:00-18:00h
Monday 29.07.2013, 17:00-21:00h
Wednesday 31.07.2013, 17:00-21:00h

Location: Hangar. Passatge del Marquès de Santa Isabel, 40. Barcelona. Metro Poblenou.

Price: Free.

To sign up, please send an email to info@lullcec.org with a brief text outlining your background and motivation for attending the workshop. Note that applications won’t be accepted if candidates are unable to commit to attending the course in its entirety.

+info: [ http://lullcec.org/en/2013/workshops/drawing-weaving-and-speaking-live-generative-music/ ]

This workshop has been produced by l’ull cec for Hangar.

Appearances elsewhere

I got a couple of kind mentions etc. lately:

That’s it! Hopefully I will survive all this attention.

New projects and events

Taking stock of the new and fast-developing projects I’m involved with.

Sound Choreography <> Body Code

A performance which creates a feedback loop through code, music, choreography, dance and back through code, in collaboration with Kate Sicchio. The first performance is this Friday at Audio:Visual:Motion in Manchester. The source code for the sound choreographer component, which choreographs using a shifting, sound-reactive diagram, is already available. I’m working on my visual programming language Texture as part of this too, which Kate will be disrupting via computer vision.

Algorave

Collaborating with other live coders, musicians and video artists using algorithms, creating events which shift the focus back onto the audience having a seriously good time. A work in progress, but upcoming events are already planned in Brighton, London (onboard the MS Stubnitz!), Karlsruhe and Sydney. More info

Declaration Kriole

Working with world music band Rafiki Jazz, making a new Kriole based on the Universal Declaration of Human Rights. I’ll be working with a puppeteer, giving a puppet a live coded voice which sings in this new language. The puppet will hopefully become a new member of the band, created through interaction within the band. First recording session soon, with live performances to follow fairly soon after. One of the more ambitious projects I’ve been involved with!

Microphone II

Working with EunJoo Shin on a new version of the Microphone. Our previous version got accepted to a couple of big international festivals, but turned out to be too big to ship! So the next iteration will have a new body, and more of a visual focus.

Slubworld

Slubworld is an on-line commission from the Arnolfini: “You are invited to join a new, on-line, sonic world co-inhabited by beatboxing robots. Participants will be able to make music together by reprogramming their environment in a specially invented language, based on state-of-the-art intarsia, campanology and canntaireachd technology. The result will be a cross between a sound poetry slam, yarn bombing, and a live coded algorave, experienced entirely through text and sound.” All for launch in May. Another ambitious project, then.

Dagstuhl seminar: Collaboration and Learning through Live Coding

Co-organising a Dagstuhl seminar bringing together leading thinkers in programming experience design, computing education and live coding.

Plus more in the pipeline, including neuroimaging and programming, a sound visualisation project at Sage Gateshead and hopefully a return of the live interfaces conference and live notation project.

Audio blast festival

Audio Blast is a streaming festival by apo33, running both in Nantes and at the Piksel festival in Bergen.

I’m performing this Saturday, November 24th, for an hour from 7pm GMT (8pm CET). I’ll be streaming quadraphonic sound from my studio in Sheffield, which will be played in both spaces, with a stream for remote listeners taken from two AKG mics in one of the spaces. More info and a link to the network stream are on the website. If anyone wants to pop by Sheffield for a listen and a beer, they’re welcome too :)

SmoothDirt programme notes

I’m doing a few solo performances over the next few days, in Cambridge, Uxbridge and Birmingbridge. Here are the programme notes/rationale:

Yaxu – SmoothDirt

From a linear perspective of time, live coding will always be somewhat distant from human experience. As computer programming is a fundamentally indirect manipulation of sound, is live coding really live? If we consider the flow of time from past to future, the time needed to modify an algorithm acts as an impenetrable barrier between coder and experience. An alternative perspective is to think of time in terms of cycles. From this perspective, if a coder’s actions lag behind the present moment, then they are also ahead of it. They are inside time, the cycle of development enmeshed with the rhythmic cycles of music, in mutual resonance. SmoothDirt is a simple language built around this idea, allowing extremes of repetition at multiple scales to be explored as musical performance.

Yaxu will produce broken techno from his laptop for around twenty minutes.
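To make the cyclic view of time in the notes above a little more concrete, here is a sketch of the idea (just the idea, not SmoothDirt itself): measure time in cycles and keep only the phase, so that lagging behind a moment is the same as leading its next repetition.

-- A sketch of cyclic time, not SmoothDirt itself: only the phase within the
-- cycle matters, so lagging behind a moment is also leading its next repetition.
type Time = Rational

phase :: Time -> Time
phase t = t - fromIntegral (floor t :: Integer)

-- A change landing 0.25 cycles after a beat is also 0.75 cycles before that
-- beat comes around again:  lag 0 0.25 == 1/4,  lead 0 0.25 == 3/4.
lag, lead :: Time -> Time -> Time
lag  beat t = phase (t - beat)
lead beat t = phase (beat - t)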