Punk robots learn to pogo

It’s the second of three punk gigs in a row for Neurotic and the PVCs, and tonight they’re sounding good. The audience seem to be enjoying it too. All around the room people are clapping and cheering, and in the middle of the mosh pit the three robots are dancing. They’re jumping up and down in the style of the classic punk pogo, and they’ve been doing it all night whenever they like the music most. Since Neurotic came on, the robots can hardly keep still. In fact, Neurotic and the PVCs might be the perfect band for these three robots to listen to, since their frontman, Fiddian, made sure they learned to like the same music he does.

Programming punks

It’s a tough task to get a robot to learn what punk music sounds like, but there are lots of hints lurking in our own brains. Inside your brain are billions of connected cells called neurons that can send messages to one another. When and where the messages get sent depends on how strong each connection is, and whenever we learn something we forge new connections and strengthen existing ones.
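To make the brain analogy concrete, here is a tiny sketch in Python of a single artificial ‘neuron’ of the kind such a network is built from. It is purely illustrative: real networks chain together many thousands of these, with smoother maths.

    # One artificial neuron: it adds up incoming messages, each scaled
    # by a connection strength (a "weight"), and only passes a message
    # on if the total is big enough.
    def neuron(inputs, weights, threshold=1.0):
        total = sum(i * w for i, w in zip(inputs, weights))
        return total > threshold  # True means "fire": send a message on

    # Strong connections make the neuron easy to trigger...
    print(neuron([1, 1], [0.9, 0.8]))  # True: 1.7 is over the threshold
    # ...while weak connections leave it silent for the same inputs.
    print(neuron([1, 1], [0.2, 0.1]))  # False: 0.3 is not enough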

What the robots’ programmers did was to wire up a network of computerised connections like the ones in a real brain. Then they let the robots sample lots of different kinds of music and told them what each piece was: reggae, pop, and of course, Fiddian’s collection of classic punk. Every example nudged the strengths of the connections in the neural network, reinforcing the ones that led to the right answer, so the more music the robots listened to, the easier it got for them to recognise what kind of stuff it was. When they recognised a style they’d been told to look out for, they would dance, firing a cylinder of compressed air to make them jump up and down.
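For the curious, here is a minimal sketch of that kind of training, written in Python with the scikit-learn library. It is a guess at the shape of the idea, not the robots’ actual code: random numbers stand in for the measurements a real system would extract from each song, such as tempo and spectral shape.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Pretend training data: 60 clips, each summarised as 30 numbers,
    # labelled with the genre the programmers said each clip was.
    features = rng.normal(size=(60, 30))
    genres = ["punk", "reggae", "pop"] * 20

    # A small neural network: every example nudges its connection
    # weights, strengthening the ones that give the right answer.
    net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                        random_state=0)
    net.fit(features, genres)

    # Once trained, the network can guess the genre of a new clip.
    print(net.predict(rng.normal(size=(1, 30))))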

The robots’ first gig

The last step was to tell the robots to go out and enjoy some punk. The programmers turned off the robots’ neural connections to other kinds of music, so no Kylie or Bob Marley would satisfy them. They would only dance to the angry, churning sound of punk guitars. The robots got dressed up in spray-painted leather, studded belts and safety pins, so with their bloblike bodies they looked like extra-tough boxing gloves on sticks. Then the three two-metre-tall troublemakers went to their first gig.

Whenever a band begins to play, the robots’ computer system analyses the sound coming from the stage. If the patterns in it match the idea of punk music they’ve learned, the robots begin to dance. If the pattern isn’t quite right, they stand still. For lots of songs they hardly dance at all, which might seem weird since all the bands playing the gig call themselves punk bands. But there are many different styles of punk music, and the robots have been brought up listening to Fiddian’s favourites. The other styles aren’t close enough to the robots’ idea of punk – they’ve developed taste, and it’s the same as Fiddian’s. Which is why the robots go crazy for Neurotic and the PVCs. Fiddian’s songs are influenced by classic punk like the Clash, the Sex Pistols and Siouxsie & the Banshees, which is exactly the music he’s taught the robots to love. As the robots jump wildly up and down, it’s clear that Neurotic and the PVCs now have three tall, tough, computerised superfans.
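In code, that gig-time decision could be as simple as the hypothetical sketch below, which reuses the trained network (net) and random generator (rng) from the earlier training sketch. The hardware calls are made-up stand-ins for the robots’ real sensing and pneumatics.

    def fire_air_cylinder():
        print("pogo!")  # stand-in for releasing the compressed air

    def on_new_sound(net, clip_features):
        # Ask the network which style the stage sound best matches.
        guess = net.predict([clip_features])[0]
        if guess == "punk":
            fire_air_cylinder()  # close enough to Fiddian's punk: jump!
        # Any other style: the robots just stand still.

    # Feed in one (pretend) snippet of stage sound.
    on_new_sound(net, rng.normal(size=30))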

Machines Inventing Musical Instruments

by Paul Curzon, Queen Mary University of London

based on a 2016 talk by Rebecca Fiebrink


Machine Learning is the technology driving driverless cars, recognising faces in your photo collection and more, but how could it help machines invent new instruments? Rebecca Fiebrink of Goldsmiths, University of London is finding out.

Rebecca is helping composers and instrument builders to design new musical instruments and giving them new ways to perform. Her work has also shown that machine learning provides an alternative to programming as a way to quickly turn design ideas into prototypes that can be tested.

Suppose you want to create a new drum-machine-based musical instrument that is controlled by the wave of a hand: perhaps a fist means one beat, whereas waggling your fingers brings in a different beat. To program a prototype of your idea, you would need to write code that could recognise all the different hand gestures, perhaps based on a video feed. You would then have some kind of decision code that chooses the appropriate beat. The second part is not too hard, perhaps, but writing code to recognise specific gestures in video is a lot harder, needing sophisticated programming skills. Rebecca wants even young children to be able to do it!
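To see why the two halves differ so much in difficulty, here is a hypothetical Python sketch of just the easy half, the decision code. The gesture names and beat patterns are invented for illustration; the missing piece, turning live video into one of these labels, is exactly the part that needs either sophisticated code or machine learning.

    # The easy half: decision code mapping a recognised gesture to a beat.
    BEATS = {
        "fist": "boom - boom - boom - boom",        # a steady beat
        "waggling_fingers": "ba-da boom-boom tss",  # a busier beat
    }

    def choose_beat(gesture):
        # Fall back to silence for gestures the instrument doesn't know.
        return BEATS.get(gesture, "(silence)")

    print(choose_beat("fist"))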

How can machine learning help? Rebecca has developed a machine learning program with a difference. It takes sensor input – sound, video, in fact just about any kind of sensor you can imagine. It then watches, listens… senses what is happening, and learns to associate what it senses with different actions it should take. With the drum machine example, you would first select one of the kinds of beats. You then make the gesture that should trigger it: a fist, perhaps. You do that a few times so it can learn what a fist looks like. It learns that the patterns it is sensing are to be linked with the beat you selected. Then you select the next beat and show it the next gesture – waggling your fingers – until it has seen enough examples. You keep doing this for each different gesture you want to use to control the instrument. In just a few minutes you have a working machine to try. It is learning by example how the instrument you want works. You can try it, and then adjust it by showing it new examples if it doesn’t quite do what you want.
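As a rough illustration of that workflow, here is a Python sketch using a simple nearest-neighbour classifier from the scikit-learn library. It is a sketch of the idea, not Rebecca’s actual software: clusters of random numbers stand in for the sensor readings of a fist and of waggling fingers.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)
    examples, chosen_beats = [], []

    def show_example(sensor_reading, beat):
        # Step 1: select a beat, make the gesture, repeat a few times.
        examples.append(sensor_reading)
        chosen_beats.append(beat)

    # Pretend sensor data: fists cluster in one region of sensor
    # space, waggling fingers in another.
    for _ in range(5):
        show_example(rng.normal(0.0, 0.1, size=10), "beat one")  # fist
        show_example(rng.normal(1.0, 0.1, size=10), "beat two")  # waggle

    # Step 2: learn to associate what is sensed with the chosen beats.
    model = KNeighborsClassifier(n_neighbors=3).fit(examples, chosen_beats)

    # Step 3: a working instrument: new gestures now trigger beats.
    print(model.predict([rng.normal(1.0, 0.1, size=10)]))  # ['beat two']

Showing it more examples simply retrains the model, which is what makes the try-it-and-adjust loop so quick.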

Rebecca realised that this approach of learning by example gives a really powerful new way to support creativity: to help designers design. In the way machine learning is traditionally used, you start with lots of examples of the things that you want it to recognise – lots of pictures of cats and dogs, perhaps. You know the difference, so you label all these training pictures as cats or dogs, and the program knows which pictures to form each pattern from. Your aim is for the machine to learn the difference between cat patterns and dog patterns so it can decide for itself when it sees new pictures.

When designing something like a new musical instrument, though, you don’t actually know exactly what you want at the start. You have a general idea, but will work out the specifics as you go. You tinker with the design, trying new things and keeping the ideas that work, gradually refining your thoughts about what you want as you refine the design of the instrument. The machine learning program can even help by making mistakes: it might not have learnt exactly what you were thinking, but as a result it makes some really exciting sound you never thought of. You can then explore that new idea.

One of Rebecca’s motivations in wanting to design new instruments is to create accessible instruments that people with a wide range of illnesses and disabilities can play. The idea is to adapt the instrument to the kinds of movement the person can actually do. The result is a tailored instrument, perfect for each person. An advantage of this approach is that you can turn a whole room, say, into an instrument so that every movement does something: an instrument that it’s impossible not to play. It is a play space to explore.

Playing an instrument suddenly really is just playing.