
The Making of The Symphony For The Post Justification Engine


This was such a fantastic project to be involved in. The Middlesbrough Art Weekender team created a massive, sprawling exhibition in disused factories and spaces across the town, a little like my Fine Rats International days, only far more professional. I got to attend an artists' meetup at MIMA and learn about everyone else's work, and the opening made me truly question my own work for being so safe.





Preliminary sketches: Thinking about 2D vs time


This project is a sister project to The Fake News Kaleidoscope and uses many similar approaches, although this piece is more ambitious in that it set out to create audio with no audio as its starting point. It is mainly built in Processing, but other tools were used as well.




Prototype software made for the Post Justification Engine.


I began by testing the hypothesis that you might somehow be able to use, or misuse, the data in an image and turn it into sound.
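
For a flavour of what "misusing" image data as sound can mean, here is a minimal Processing sketch of one naive approach, not the project's actual code: it scans an image column by column and drives a sine oscillator with each column's average brightness. It assumes the standard Processing Sound library and a hypothetical image file called source.png in the sketch's data folder.

    // Scan an image left to right, mapping each column's average
    // brightness to the pitch of a sine oscillator.
    import processing.sound.*;

    PImage img;
    SinOsc osc;
    int col = 0;

    void setup() {
      size(600, 400);
      img = loadImage("source.png");  // hypothetical source image
      img.loadPixels();
      osc = new SinOsc(this);
      osc.play();
      frameRate(30);  // one column per frame, 30 columns per second
    }

    void draw() {
      image(img, 0, 0, width, height);
      // average the brightness of the current column
      float sum = 0;
      for (int y = 0; y < img.height; y++) {
        sum += brightness(img.pixels[y * img.width + col]);
      }
      float avg = sum / img.height;
      // misuse the data: brightness 0-255 becomes pitch 100-1000 Hz
      osc.freq(map(avg, 0, 255, 100, 1000));
      col = (col + 1) % img.width;
    }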









I then used a simple Hough line detection algorithm to plot out the basic "structure" of the image. This gives start and end points which, if you play fast and wild, can be coaxed into the start and end of a MIDI note and a frequency.
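
As an illustration of how that coaxing might look, here is a short Processing sketch, assuming Greg Borenstein's OpenCV for Processing library; the mapping from line coordinates to ticks and pitches is my own stand-in, not the piece's actual rule.

    // Turn Hough lines into rough note events:
    // x positions become note start/end times, vertical position
    // becomes a MIDI note number.
    import gab.opencv.*;

    OpenCV opencv;

    void setup() {
      size(600, 400);
      PImage img = loadImage("source.png");  // hypothetical source image
      opencv = new OpenCV(this, img);
      opencv.findCannyEdges(20, 75);

      // threshold 100, min line length 30 px, max gap 20 px
      ArrayList<Line> lines = opencv.findLines(100, 30, 20);
      for (Line l : lines) {
        int startTick = int(l.start.x);
        int endTick   = int(l.end.x);
        // top of the image = high pitch, bottom = low pitch
        int pitch     = int(map(l.start.y, 0, img.height, 96, 24));
        println("note " + pitch + " from " + startTick + " to " + endTick);
      }
    }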




The instrument used can be defined by a series of numbers. I tried to learn SuperCollider during this project, doing three tutorials on the train to Oxford. It's a tool for creating and controlling your own synthesizers, and although I made some ridiculous sounds that must have annoyed my fellow travellers, I wasn't able to control them in any way, so I will return to SuperCollider another day.




As I created crude animations of the engine at work, I threw OpenGL shaders at them. OpenGL shaders are used in computer games to render complex scenes, and the code is easy enough to hack. I believe that, because they talk straight to your graphics chip, they can create effects that otherwise wouldn't be possible. Go see some at https://www.shadertoy.com/browse and feel your computer getting warmer.
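
In Processing, the shader side of that workflow is only a few lines. This is a minimal sketch, not the project's code; "effect.glsl" is a stand-in name for any fragment shader (for instance one adapted from Shadertoy) sitting in the sketch's data folder.

    // Draw a crude animation, then post-process the whole frame
    // with a fragment shader.
    PShader effect;

    void setup() {
      size(600, 400, P2D);   // shaders need an OpenGL renderer
      effect = loadShader("effect.glsl");  // hypothetical shader file
    }

    void draw() {
      // the crude animation...
      background(0);
      ellipse(frameCount % width, height / 2, 60, 60);
      // ...then run the shader over the frame; here we pass the
      // clock in, assuming the shader declares a "time" uniform
      effect.set("time", millis() / 1000.0);
      filter(effect);
    }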






In one of the processes used, I turned the images into black-and-white versions and then read each image from left to right, turning it into a "piano roll" of sorts, as sketched below. After many other iterations of wildly different algorithms, such as Hough line detection and random sampling, I was surprised to find that the "images", or at least a vaguely recognisable version of them, were discernible in the MIDI produced.
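
A minimal Processing sketch of that piano-roll reading, again illustrative rather than the piece's code, assuming a pre-thresholded black-and-white image called source-bw.png: each column becomes a time step, each row a pitch, and any dark pixel becomes a note-on.

    // Read a black-and-white image left to right as a piano roll.
    PImage img;

    void setup() {
      size(600, 400);
      img = loadImage("source-bw.png");  // hypothetical thresholded image
      img.loadPixels();
      for (int x = 0; x < img.width; x++) {        // time, left to right
        for (int y = 0; y < img.height; y++) {     // pitch, top = high
          if (brightness(img.pixels[y * img.width + x]) < 128) {
            // map rows onto the 88-key piano range (MIDI 21-108)
            int pitch = int(map(y, 0, img.height, 108, 21));
            println("step " + x + ": note on " + pitch);
          }
        }
      }
    }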


You can almost see the "man in the middle by the window", the "pointy bits" in the first image and the third shell. Almost.



The MIDI produced by the Post Justification Engine.