
A weekend building an arduino-based robot


[The bot itself, looking sombre]

Last month, I went away to Stourbridge, a great centre of technology and robotics in the UK's scenic midlands. Here are some notes, pictures and film from that journey.

(I lied about Stourbridge).

The little train that brought me over from the main rail routes felt like a mix between the Totoro cat train and the slowly chugging train of death that inevitably carries away dead steampunks.

I later learned it had been built on very little money, so health and safety provision was minimal, and it was always breaking down.

But my friend Mat lives on the edge of this sleepy town. He said that to one side of his house there was wasteland and empty industrial buildings. The other side, he warned, was a land of chavs with blue-neon-underlit cars; there were also pubs and strip clubs, and lots of nurseries.

For a long time, Mat had been collecting arduinos and sensors of all kinds, hoping for a weekend of calm in which to play with it all. He invited me over to build a sonic robot.

So I had gone over for a weekend of playing with cool tech, which turned out to include generative sound with PD and Max MSP, arduinos, robots, cameras, c++ libraries, and lots of gaming. It's for a show he wants to do next year.

We also talked about the idea of a combined virtual and real tree, perhaps feeding into each other. My tree would live in Second Life or Opensim: a robotic avatar which would talk to its attachments and generate detachable fruits if it was paid Lindens or fed nice textures. Mat's would live in real life, made of arduinos, servos and LEDs, and might for example produce virtual fruits at a GPS location that you'd need an AR platform to go and find.

So Thursday was a quick introduction to the arduino. We did the first hello world tutorial and got out Mat's extensive collection of arduinos, roboduinos, arduino diecimila, cornettos, arduino super maxis, calippos and mars bar ice creams. So after getting one LED's worth of blinking satisfaction, we created two little things out of plastic:




To make these, we sawed a CD box cover in half and attached three ultrasonic oscillators to either side (bought for a tenner each at coolcomponents.co.uk). Inside each box cover was a little arduino controlling it. Our idea was then to wire this up so that music could be played off its tactile nature, along the lines of some 80s experiments with infra-red range finders.
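We never wired up that mapping on the day, but the idea is simple enough to sketch. Here's a hypothetical Python version of distance-to-pitch: wave a hand over a sensor and the range reading picks a note. The scale, ranges and units are my own assumptions, not anything we built.

```python
# Hypothetical sketch: quantise an ultrasonic range reading onto a scale,
# so a hand hovering over the sensor plays notes. Scale and max range are
# assumptions for illustration only.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes, C4 up to C5

def distance_to_note(cm, max_cm=80):
    """Map a distance in centimetres onto one octave of C major."""
    cm = max(0, min(cm, max_cm - 1))          # clamp to the sensor's range
    index = int(cm / max_cm * len(C_MAJOR))   # nearer hand = lower note
    return C_MAJOR[index]

print(distance_to_note(0))   # hand right on the sensor -> 60 (C4)
print(distance_to_note(79))  # hand at arm's length     -> 72 (C5)
```

On the arduino itself this would just be the same arithmetic run on each ping reading, with the note number sent out over serial or straight to a tone pin.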

But then we got out PD, and started playing around with that, as well as getting out loads of computer games like Left 4 Dead and Fight Night round 4 - great inspiration for messing with electronics and sound generation.

So next morning came a fun Puredata workshop: downloading it, setting it up on linux, and generating random sounds in our stoic Stourbridge surroundings (a local JD Wetherspoons), to the annoyance and tutting of the locals. I learnt that you can wire up Playstation 3 controllers to PD via USB and they will control anything on the PC. PD looks a lot harder to figure out than Max MSP, but its help files' introduction page mentions Xenakis and Stockhausen.

And then, after some stir fry, we put together a robot kit, the kind you buy at bookshops or museum shops. This one was from Robot Shop (although I can't find a direct link to it; Mat says it's called the Rover). We ripped out the frame and the wheel motors and attached them to an arduino. After we'd put 20 tiny bits of metal into 20 tiny little holes on each track, it was quite easy to get the arduino controlling a single tracked wheel, but there was not enough power to run them both, much less to carry loads of shit around like a robotic pack-horse mini-me. It would have needed a transistor and a 9v battery, which meant a trip to Maplins.

So we sidestepped that whole issue and added the contents of a £50 Edimax webcam and wifi pack to the robot.


[To indoctrinate robots into the human world, it is traditional to filially imprint them with some late night Jonathan Ross]



We removed the webcam from the white side of its plastic container, exposing its internal LED(s) as well. This we then taped on to the robot box, and on top of the whole thing went the wifi router, which was also in the Edimax pack. I soon downloaded an android IP cam app and connected to the camera over the wifi router, although being plugged in doesn't make our little robot any better at being autonomous...

But we could add batteries at some point. It needs either a set of 6v, 9v and 12v batteries, or just a rack of AA batteries like a radio-controlled car.




[Slightly menacing blue lights, again showing a strange resemblance between television and sound bot]



We also thought of using one of the range finders for it, so that it would have an easy way of avoiding obstacles, but then Mat's experiments with video-to-audio pretty much removed that need.

And we took loads of photos of the photos it was taking. Getting the camera working on linux took a bit longer, though, as I had to download all the info and a big 150 meg library via my g1. (As a sign of the times, there was no working internet in the flat, but we both had it via 3g on our phones, which we'd use to transfer files over a USB cable.)

Meanwhile, I wrote my first C++ library for arduino, stealing shamelessly from a Twitter library I found online that could handle basic authentication.

So easy to make robots nowadays! I bet in future our robot dolls will be home made too.

I'm back now, a few hours later. We took a slight detour with a Max/MSP patch that reads Mat's Mac's camera and turns it into ostinato piano notes based on how bright each pixel is. Mat then scaled it up by a factor of 250, working from the average of each row of pixels rather than every pixel. It's about as good as the range finders were to begin with. We had loads of fun and made a couple of videos of Mat playing his computer like a piano-sampling theremin.
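The patch itself was Max/MSP, but the core mapping is small enough to sketch in Python. Everything specific here, the note range and the linear brightness-to-pitch mapping, is my guess at the idea, not a transcription of Mat's patch.

```python
# Illustrative sketch of a brightness-to-note mapping (the real patch was
# Max/MSP; the scale and note range here are assumptions, not Mat's values).

def brightness_to_midi(brightness, low_note=48, high_note=84):
    """Map a 0-255 pixel brightness onto a MIDI note number."""
    span = high_note - low_note
    return low_note + round(brightness / 255 * span)

def row_to_notes(row):
    """Turn one row of pixel brightnesses into an ostinato of notes."""
    return [brightness_to_midi(b) for b in row]

# A dark pixel plays low, a bright one high:
print(brightness_to_midi(0))    # 48 (C3)
print(brightness_to_midi(255))  # 84 (C6)
print(row_to_notes([0, 128, 255]))
```

The row-averaging trick Mat used would just mean calling `brightness_to_midi(sum(row) / len(row))` once per row instead of once per pixel, which is why it scales up so cheaply.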

It's also, to my mind, a lot more accessible than a range finder, ultrasonic or IR, because everyone in the UK has a camera, and everyone is being videoed or watching video constantly through the day.

Imagine an installation where you took over all that CCTV in a space, and used it to generate sound based on the people passing through it.



I kept working on my C++ library, and it can now log in, request a page, and get a status code of 200 back from the camera. You can also install it in the arduino development environment and it will run out of the box; it even comes with an example. It's not much really, but it was late... And with the pdf of the camera's cgi calls, it will soon be able to deliver lots of info to the arduino based on the images it sees.
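The library itself is C++ on the arduino, but the "log in and request a page" part is just HTTP basic auth, which is easy to show in Python: you base64-encode "user:password" and send it in an Authorization header. The host, path and credentials below are made up for illustration.

```python
# Python sketch of what the arduino library does over the socket: build a
# GET request with an HTTP basic auth header. Host, path and credentials
# are invented examples, not the camera's real ones.
import base64

def basic_auth_header(user, password):
    """Basic auth is just base64("user:password") in a header."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return f"Authorization: Basic {token}"

def build_request(host, path, user, password):
    """Assemble the raw HTTP/1.1 request the arduino writes to the socket."""
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"{basic_auth_header(user, password)}\r\n"
        f"Connection: close\r\n\r\n"
    )

print(build_request("192.168.2.3", "/snapshot.cgi", "admin", "1234"))
```

A "200 OK" in the first line of the response is the success the library checks for; anything in the 401 range means the header above was wrong or missing.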

Next step will be to get the little camera for the robot sending data back to the Arduino, so that it can pick up visual info and use it to tell where it's going. I had a geeky time messing around with the arduino and its Ethernet shield, getting it to be a web server and client (ooh, arduino p2p, I count the days until your birth), but unfortunately still not managing to get it to connect to anything useful.

Mat's final vision for the robot is that it is eventually able to find its way around using the camera, and to find the most musical places, so that in its symbiotic relationship with humans it can deliver pleasurable noises. We also kept the first basic idea of using our original rangefinder interface, maybe mounted on the robot or in an area close to it, so as to let people play duets with the robot and find even better noises based on what it finds.


[Our little robot relaxes after a long and arduous hacking session]


How to do memory:
Instead of storing sound samples on an SD card (which could also be good, for obtaining on-board recordings), it could store much more data based on the images it sees, before it turns them into noise. At 15 frames a second it would take 900 samples a minute, so 54,000 an hour, and over a 9-hour day 486,000 individual samples to take care of. But each row of elements holds 256 values, so that gives 124,416,000 bits stored in a day, i.e. around 15MB.
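The back-of-envelope sums above can be checked in a few lines of Python, keeping the note's assumption that each of the 256 row elements costs a single bit:

```python
# Reproducing the day's-worth-of-frames arithmetic from the note above.
fps = 15                    # frames per second
hours = 9                   # length of the robot's "day"
elements_per_sample = 256   # one row of elements per frame, 1 bit each

samples_per_minute = fps * 60                          # 900
samples_per_hour = samples_per_minute * 60             # 54,000
samples_per_day = samples_per_hour * hours             # 486,000
bits_per_day = samples_per_day * elements_per_sample   # 124,416,000
megabytes = bits_per_day / 8 / 1_000_000               # ~15.5 MB

print(samples_per_day, bits_per_day, round(megabytes, 1))
```

If each element were a full byte rather than a bit, the same day would cost eight times as much, around 124MB, so the single-bit assumption matters a lot for what fits on the SD card.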

But that's only if it was gathering this stuff all day: it only needs to gather the stuff it likes (based on an algorithm Mat's got in mind), so maybe a 30 sec vision-based sample at most, which would then be stored on the SD card so that it could be played back in combination with other stuff it liked. Samples could maybe be stored in 3 types (Lady Ada's wave shield allows the playing of 3 sounds at a time - enough for percussion, chord/drone and soloist/multiple tones).

Mat's algorithm would do a fitness test:

is this a C major chord, or a scale, or something I can identify?
  Yes:
    What type is the sound: percussive, slow tone, quick tone?
    Store a segment of that type.
    Move slightly forwards or back to try and get nice variations.
  No:
    Turn X degrees left or right.
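As a literal rendering of that decision step, here's a Python sketch. The hard part, actually recognising chords and classifying sounds, is exactly what Mat's algorithm would have to do, so `identify` and `classify` are stubbed out as placeholders here, not anything Mat has written.

```python
# One step of the fitness test above. identify() and classify() are
# hypothetical callbacks standing in for Mat's (unwritten) chord detection.

def fitness_step(segment, identify, classify, store, move, turn):
    """Keep what sounds musical and nudge for variations; otherwise turn away."""
    if identify(segment):            # e.g. a C major chord, or a scale
        kind = classify(segment)     # 'percussive', 'slow tone' or 'quick tone'
        store(segment, kind)         # write the typed segment to the SD card
        move()                       # move slightly forwards or back
        return "stored"
    turn()                           # turn X degrees left or right
    return "turned"

# Toy run with trivial stand-in callbacks:
kept = []
fitness_step("clip", lambda s: True, lambda s: "percussive",
             lambda s, k: kept.append((s, k)), lambda: None, lambda: None)
print(kept)  # [('clip', 'percussive')]
```

Run in a loop over incoming camera-derived sound segments, this is the whole symbiosis: the robot wanders until `identify` starts saying yes, then hovers there collecting material.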


Arduinos and bits bought from:
Devonrobotics.co.uk, corecomponents.co.uk
Robot shop construction kit: £40 including an arduino!
Lady Ada Wave shield kit: also quite cheap, and including another Arduino.
skpang.co.uk : very cheap bits and oscillators.

My love With peace I have placed loving flowers at your feet With peace With peace I stopped the seas of blood for you Forget anger Forget pain Forget your weapons Forget your weapons and come Come and live with me my love Under a blanket of peace I want you to sing, beloved light of my eyes And your song will be for peace let the world hear, my beloved and say: Forget anger Forget pain Forget your weapons Forget your weapons and come And live in peace These I believe are the words of a widow at the tomb of her beloved. I got the words from this italian website . It was used in a seminal Italian anti-war song " Luglio Agosto Settembre Nero " by the band Area (although I guess they weren't called anti-war songs then) - whose vocalist Demetrio Stratos indirectly gives the name to this blog, and whose music is the inspiration for a lot of my mine. It's adapted in turn from a greek folk song, but no-one knows who wrote the original words, except that Stratos was probably