Sunday, 16 March 2014

Arduino range sensors

This beauty is the very wonderful HC-SR04 proximity sensor: an ultrasonic range finder...

It bounces high frequency sound waves off objects in its line of sight...

I have two connected: left and right. This is the output of the serial monitor.
The sketch that controls this is trivial.
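For the record, the sketch really is trivial: drive the TRIG pin high for 10 µs, time the returning echo with pulseIn(), and convert microseconds to centimetres. Sound covers roughly 1 cm every 29 µs and the ping travels out and back, so cm ≈ µs / 58. Here is the conversion, with the Arduino-side calls sketched in comments (pin wiring and the exact loop are assumptions, not the code from this post):

```cpp
// HC-SR04 reading, boiled down. On the Arduino side the loop is roughly:
//   digitalWrite(TRIG, LOW);  delayMicroseconds(2);
//   digitalWrite(TRIG, HIGH); delayMicroseconds(10);  // 10 us trigger pulse
//   digitalWrite(TRIG, LOW);
//   long cm = microsecondsToCm(pulseIn(ECHO, HIGH));  // time the echo
//   Serial.println(cm);                               // -> serial monitor
// (TRIG/ECHO pin numbers depend on your wiring; with two sensors you
// simply repeat this for the left and right pairs.)

// Round-trip echo time to distance: ~29 us per cm, doubled for out-and-back.
long microsecondsToCm(long echoMicroseconds) {
    return echoMicroseconds / 58;  // integer cm is plenty for proximity sensing
}
```

580 µs therefore reads as 10 cm, and anything past about 23 ms (roughly 4 m) is beyond the sensor's range.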

Tuesday, 11 March 2014

Tweetenstein's face stitched together - animated JavaScript lip-synch TTS from Google Translate API

After a lot of fiddling over the last few days, I've managed to make a crude html page animate to the text-to-speech output from the Google Translate API.
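The approach can be sketched roughly like this. Everything below is a reconstruction under my own assumptions: the translate_tts URL shape, the mouth image names, and the character-to-mouth mapping are illustrative stand-ins, not the code from this page.

```javascript
// Build a URL for the Google Translate TTS endpoint (assumed URL shape).
function ttsUrl(text) {
  return "https://translate.google.com/translate_tts?tl=en&q=" +
         encodeURIComponent(text);
}

// Map a character to a mouth image. This mapping is a crude guess:
// open vowels get the wide mouth, o/u/w the pursed one, m/b/p closed.
function mouthFrameFor(ch) {
  ch = ch.toLowerCase();
  if ("aei".indexOf(ch) !== -1) return "open";
  if ("ouw".indexOf(ch) !== -1) return "pursed";
  if ("mbp ".indexOf(ch) !== -1) return "closed";
  return "half";
}

// The frame sequence to flash while the clip plays.
function lipSyncFrames(text) {
  return text.split("").map(mouthFrameFor);
}

// Browser-side glue (hypothetical element id and image paths): play the
// TTS audio and swap the mouth <img> on a timer until the clip ends.
function speak(text) {
  var frames = lipSyncFrames(text), i = 0;
  var mouth = document.getElementById("mouth");
  var audio = new Audio(ttsUrl(text));
  var timer = setInterval(function () {
    mouth.src = "mouths/" + frames[i++ % frames.length] + ".png";
  }, 80); // ~12 swaps a second looks convincingly mouth-like
  audio.onended = function () {
    clearInterval(timer);
    mouth.src = "mouths/closed.png";
  };
  audio.play();
}
```

The timer-driven frame swap is deliberately dumb: it doesn't try to stay in step with the audio, it just flaps for the duration of the clip, which turns out to be enough to sell the illusion.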

This is a still of the face, showing the two eyes and one mouth.

The mouths are taken from a lip-reading example, then given a plastic filter effect in Photoshop.

On the left, before. On the right, after.

And this is the slightly unsettling look you get. The eyes are done in the same way; they are actually my own, put through the same filter...

And here are examples of mouths.


The JavaScript that does this is in here:

This needs a local server running, as it loads an external (JSON) data file...
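The reason for the server: browsers block XMLHttpRequest against file:// URLs, so the page must be served over http (something like `python -m SimpleHTTPServer` will do). A minimal loader along these lines, where the `{"phrases": [...]}` schema is my own stand-in for whatever the real data file holds:

```javascript
// Parse the data file; assumes a {"phrases": [...]} shape (a stand-in schema).
function parsePhrases(jsonText) {
  return JSON.parse(jsonText).phrases || [];
}

// Fetch the external JSON over http; this is why a local server is needed,
// since XMLHttpRequest will refuse to load the file from file://.
function loadPhrases(url, onReady) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url);
  xhr.onload = function () { onReady(parsePhrases(xhr.responseText)); };
  xhr.send();
}
```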

It sits inside an html page, thus:

Wednesday, 5 March 2014

Tweetenstein - full-sized data-connected automaton

This beauty is destined to become the next data-fed automaton, Tweetenstein!
Tweetenstein is the successor to Twitr Janus and the Psychic Hive Mind Fortune Reader, both of which used simple web services to control or manipulate responses.

The basic concept is that Tweetenstein starts life as a corpse (well, a mannequin) and is given life by plugging it into social data from the web, mainly from conversations on Twitter. It is yet to be decided what form this reanimation will take, but starting with a slightly disturbing 6 foot blank humanoid shape has to be a good thing, right?

The pug gets the vibe. Great startled expression.

Here, touching a fake hand is being used as a novel form of input device. A simple Michelangelo-esque touch of fingers causes the web app to randomly select Twitter screen names and read them out as speech.

The words on screen are a visualisation of this data. The app is converting it to speech using a Google Translate API call, triggered by an onClick method in JavaScript. This is being physically triggered by the Makey Makey board.
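The glue between the two is small. In the sketch below, the element id, the names array, and the wiring function are hypothetical stand-ins for the real app; the testable part is the random pick.

```javascript
// Pick one screen name at random from the captured Twitter data.
function pickScreenName(names) {
  return names[Math.floor(Math.random() * names.length)];
}

// Hypothetical wiring: the Makey Makey sends a real space-bar keydown,
// which we forward to the same button a mouse click would hit.
function wireUpSpaceBar() {
  document.addEventListener("keydown", function (e) {
    if (e.keyCode === 32) {                      // space bar
      document.getElementById("speakButton").click();
    }
  });
}
```

From the web app's point of view there is nothing special going on: it just sees an ordinary space-bar press.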

Makey Makey is a pimped Arduino that converts touch into keyboard input for any running app :)

It works by shorting two terminals together: an earth terminal and one of several signal terminals. Here we are using the space bar.

Hidden underneath this crude prototype is a wire connecting the fingertip to the Makey Makey signal terminal.