Category Archives: visualisation

Sonic Pattern and the Textility of Code

Really excited to be involved with this dream event in London next month:

Sonic Pattern and the Textility of Code

11am to 6pm, 13th May 2014
Limewharf, Vyner St, London E2 9DJ
£20/£15
https://www.eventbrite.co.uk/e/sonic-pattern-and-the-textility-of-code-tickets-11330352389

An event that brings together diverse viewpoints on weaving, knitting, live coding, dyadic mathematics, generative music and digital making, in order to see how patterned sound and threads allow us to both sense the abstract and conceptualise the tactile. We will look for a rich view of technology as a meeting point of craft, culture and live experience.

The invited speakers will explore aspects of making, process, language, material and output in relation to their own practice and related contexts.

The discussion will be led by Bronac Ferran, Janis Jefferies, and David Toop, and practitioners include Alessandro Altavilla, Felicity Ford, Berit Greinke, Ellen Harlizius-Klück, Alex McLean and Becky Stewart.

There will be audio-visual interludes through the day, including a screening of Ismini Samanidou and Scanner’s film Weave Waves, commissioned for the Sound Matters exhibition in 2013 by the Crafts Council, and a short performance by Felicity Ford.

The event will close with a live music performance from Leafcutter John, Matthew Yee-King and Alex McLean, exploring code, pattern and sound.

Curated by Karen Gaskill, Crafts Council

A collaboration between the Crafts Council, ICSRiM (School of Music, University of Leeds), the Thursday Club (Goldsmiths), V&A Digital Futures and the Live Coding Research Network.

Made possible through funding and support from the Crafts Council, Sound and Music, the Arts and Humanities Research Council and the Centre for Creative Collaboration.

Experimentallabor residency

I’m on the way to take part in a short residency in Düsseldorf, hosted by Julian Rohrhuber at the Robert Schumann School:

Fifth Experimentallabor Residency: Penelope’s Loom – Coding threads in antiquity, live notation and textile inspired programming languages
Structure can be result and origin of a dynamic process at the same time – a thought that is common to weaving, mathematics and music. Today, as programming has become a practice that is closer to improvisation than to machine control, this commonality becomes increasingly interesting for the arts. It is along these lines, in the fifth Experimentallabor Residency, that Ellen Harlizius-Klück, Alex McLean, and Dave Griffiths will rethink programming languages in the arts in conjunction with the history of weaving.
Introduction: Wed Feb 5 2014, 17:30, IMM Experimentallabor

Lots more events coming up, full list here.

Colourful texture

Texture v.2 is getting interesting now; it reminds me of fabric travelling around a loom…

Everything apart from the DSP is implemented in Haskell. The functional approach has worked out particularly well for this visualisation — because musical patterns are represented as functions from time to events (using my Tidal EDSL), it’s trivial to get at future events across the graph of combinators. Still much more to do though.
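As a rough sketch of why that works (simplified types and illustrative names, not Tidal’s actual definitions): if a pattern is just a function from a time arc to the events falling inside it, then “looking ahead” for a visualisation is nothing more than querying a later arc.

```haskell
-- Hedged sketch: patterns as functions from a time arc to events.
-- These names (Event, Pattern, pure') are illustrative, not Tidal's own.
type Time = Rational

data Event a = Event { start :: Time, end :: Time, value :: a }
  deriving (Show, Eq)

-- a pattern maps a queried arc (begin, end) to the events within it
type Pattern a = (Time, Time) -> [Event a]

-- a pattern repeating one value each cycle
pure' :: a -> Pattern a
pure' x (s, e) =
  [ Event (fromIntegral c) (fromIntegral c + 1) x
  | c <- [floor s .. ceiling e - 1] ]

main :: IO ()
main = do
  -- querying two cycles gives two events; querying a future arc
  -- (e.g. cycles 100-101) is exactly the same operation
  print (pure' "bd" (0, 2))
  print (pure' "bd" (100, 101))
```

Because a pattern is a pure function of time, a visualiser can ask for any future arc without side effects, which is what makes drawing upcoming events across a graph of combinators straightforward.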

Texture 2.0 bug exposure

Texture 2.0 (my Haskell based visual live programming language) is working a bit more. It has reached gabber zero – the point at which a programming language is able to support the production of live techno. Also I’ve made some small steps towards getting some of my live visualisation ideas working. Here’s a video which exposes some nice bugs towards the end:

This is an unsupported, very pre-alpha experiment, but if you want to try to get it working, first install Tidal (and if you want sound, the associated “dirt” sampler). Then download the code from here:

https://github.com/yaxu/hstexture

… and run it with something like runhaskell Main.hs


PhD Thesis: Artist-Programmers and Programming Languages for the Arts

With some minor corrections done, my thesis is finally off to the printers.  I’ve made a PDF available, and here’s the abstract:

We consider the artist-programmer, who creates work through its description as source code. The artist-programmer grandstands computer language, giving unique vantage over human-computer interaction in a creative context. We focus on the human in this relationship, noting that humans use an amalgam of language and gesture to express themselves. Accordingly we expose the deep relationship between computer languages and continuous expression, examining how these realms may support one another, and how the artist-programmer may fully engage with both.

Our argument takes us up through layers of representation, starting with symbols, then words, language and notation, to consider the role that these representations may play in human creativity. We form a cross-disciplinary perspective from psychology, computer science, linguistics, human-computer interaction, computational creativity, music technology and the arts.

We develop and demonstrate the potential of this view to inform arts practice, through the practical introduction of software prototypes, artworks, programming languages and improvised performances. In particular, we introduce works which demonstrate the role of perception in symbolic semantics, embed the representation of time in programming language, include visuospatial arrangement in syntax, and embed the activity of programming in the improvisation and experience of art.

Feedback is very welcome!

BibTeX record:

@phdthesis{McLean2011,
    title = {{Artist-Programmers} and Programming Languages for the Arts},
    author = {McLean, Alex},
    month = {October},
    year = {2011},
    school = {Department of Computing, Goldsmiths, University of London}
}

RIS record:

TY  - THES
ID  - McLean2011
TI  - Artist-Programmers and Programming Languages for the Arts
PB  - Department of Computing, Goldsmiths, University of London
AU  - McLean, Alex
PY  - 2011/10/01
ER  - 

Attending to presentation slides

I had some fun with my talk at ICMC earlier this month.

I started in the usual way with an outline slide, going through bullet points one by one outlining the structure of my talk.  Importantly, I tried to talk continuously while the slide was up.

On the next slide was a picture of a boy throwing a stone into the sea, I talked about it for a while, making the point that it was easy to perceive the image while listening to my voice.  The audience hopefully found they could attend simultaneously to the visual scene and my linguistic speech.

I then skipped back to the previous slide and pointed out that the outline slide actually had little to do with what I had been saying.  Here’s the contents of that first slide:

  • A live coding talk towards the end of the conference
  • Some strange programming languages were shown
  • He made a point about cognition that I didn’t quite get
  • The demo didn’t work out too well
  • I was a bit tired but he seemed to be trying to say something about syntax

This got some laughs.  There were quite a lot of people in the room, and the slide had been up for a while, but as far as I could gather no-one had managed to read any of it.  My contention was that they couldn’t read it while listening to my voice; it’s too difficult to attend to two streams of language at once.  I didn’t really know what would happen, but from talking to audience members afterwards it seems at least some people got a sense that something was wrong, but couldn’t work out what it was until I told them.

This was a nice practical demonstration of Dual Coding theory, and led into my argument for greater integration between visual and linguistic elements of computer languages.  However there’s probably a point in there about the design of presentation slides.  If you want people to listen to what you’re saying, put short prompts on your slides, but not real sentences, because the audience won’t be able to read them while listening to your voice.


Workshop output

The Text live coding workshop went really well, surprisingly so considering it was the first time anyone apart from me had used it and (as I found out afterwards) most of the participants didn’t have any programming experience. The six participants took to the various combinators remarkably quickly, the main stumbling block being getting the functions to connect in the right way… Some UI work to do there, and I got some valuable feedback on it.

Once the participants had got the hang of things on headphones, we all switched to speakers and the seven of us played acid techno for an hour or so together, in perfect time sync thanks to netclock. Here’s a mobile phone snippet:

The sound quality doesn’t capture it there, but for me things got really interesting musically, and it was fun walking around the room panning between the seven players…

Text update and source

I’ve updated Text a bit to improve the visual representation of higher order types (you’d probably need to view it full screen):

I won’t be touching this until after the workshop on Saturday.

I’ve also made the source for the visual interface available here under the GPLv3 free license. To get it actually working as above you’d also need to install my tidal library, Jamie Forth’s network sync, my sampler, the nekobee synth, and somehow get it all working together. In short, it’s a bit tricky, I’ll be working on packaging soonish though.

Text

Text is an experimental visual language under development.  Code and docs will appear here at some point, but all I have for now is this video of a proof of concept.

It’s basically Haskell but with syntax based on proximity in 2D space, rather than adjacency.  Type compatible things connect automatically, made possible through Haskell’s strong types and currying.  I implemented the interface in C, using clutter, and ended up implementing a lot of Haskell’s type system.  Whenever something changes it compiles the graph into Haskell code, which gets piped to ghci.  The different colours are the different types.  Stripes are curried function parameters.  Lots more to do, but I think this could be a really useful system for live performance.
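The connect-by-type idea can be sketched in miniature (this is a toy model with illustrative names, not the real Text internals): each node carries a fragment of Haskell source and a type, and applying one node to another succeeds only when the argument type matches — currying means each application simply leaves a new node awaiting its next argument.

```haskell
-- Hedged toy sketch of compiling a node graph to Haskell source.
-- Ty and Node are illustrative; the real system handles much more.
data Ty = TInt | TFun Ty Ty
  deriving (Eq, Show)

data Node = Node { expr :: String, ty :: Ty }

-- connect one node to another only when the types are compatible,
-- emitting the composed Haskell source as a new node
connect :: Node -> Node -> Maybe Node
connect (Node f (TFun a b)) (Node x t)
  | a == t = Just (Node ("(" ++ f ++ " " ++ x ++ ")") b)
connect _ _ = Nothing

main :: IO ()
main = do
  let plus = Node "(+)" (TFun TInt (TFun TInt TInt))
  -- currying in action: applying one argument leaves a function node,
  -- which can still connect to one more type-compatible input
  case connect plus (Node "1" TInt) >>= \n -> connect n (Node "2" TInt) of
    Just m  -> putStrLn (expr m ++ " :: " ++ show (ty m))
    Nothing -> putStrLn "no type-compatible connection"
```

The emitted string (here "(((+) 1) 2)") stands in for the code that the real interface pipes to ghci whenever the graph changes.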

Visualisation of Live Code

I wrote a paper with Dave Griffiths and Nick Collins on the visualisation of live code, exploring ideas around live coding interfaces, accepted for the EVA London 2010 conference in July. An HTML version is below, or see the PDF Preprint.


Alex McLean (Goldsmiths), Dave Griffiths (FoAM), Nick Collins (University of Sussex) and Geraint Wiggins (Goldsmiths)

Abstract

In this paper we outline the issues surrounding live coding which is projected for an audience, and in this context, approaches to code visualisation. This includes natural language parsing techniques, using geometrical properties of space in language semantics, representation of execution flow in live coding environments, code as visual data and computer games as live coding environments. We will also touch on the unifying perceptual basis behind symbols, graphics, movement and sound.
