Untitled 47, 2021

Sony RX100 VI, ffmpeg, opencv, fsynth, GLSL

This is an algorithmic piece that uses a clip of digital video as its source and interprets its information in ways it was not originally intended for. The movie file shown here is a screen capture of a sample of the process, which continuously churns the input data and produces a stream of endlessly unique content.

Art, Collection, Electrons

BrowniePi

Art, Electrons, Photons, Project

“A terrible idea that should never have been tried!” – Ansel Adams, April 2021.

This image is captured by shooting the film back of a Brownie 620 camera with a Raspberry Pi camera mounted inside the black chamber above the lens, just outside the optical path. High-gain screen material is used on the image plane to maximize the reflected light. It works, sort of… The main limitation is that very little light reaches the sensor, so even in full sun I have to use the highest ISO available and about a 0.25-second exposure. Also, since the sensor is not in line with the optical axis, the captured image is skewed and you end up with soft focus at the top and bottom.

First image, un-warped and brightened.

It’s a fun project, and I’m sure I’ll complete it someday. I am waiting to see if a higher-sensitivity chip becomes available for the RPi. Ultimately, the best solution would be to manufacture a lens element that would refocus the 120-film-size image onto the small sensor mounted in the back, or a concave mirror on the back that would reflect the image onto the sensor where it is currently mounted (both of these solutions are beyond my pay grade).

This uses a Raspberry Pi Zero W and a small LCD panel, both powered by a PiSugar battery. The operating system is DietPi. The camera is operated with the buttons on the LCD panel board and the resulting image is displayed on the screen. Using VNC and a virtual desktop, I also get the live feed from the camera directly on a laptop. The pictures are saved with the raw sensor information and converted to DNG. I designed a 3D-printed adapter to mount the electronic components on the body of the camera, but I am waiting on more light-sensitive solutions to finish the piece.
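
For reference, the raw capture and DNG conversion can be done entirely on the Pi. Here is a minimal sketch, assuming the picamera and PyDNG libraries; the ISO, shutter speed, and file name are illustrative:

import time
from picamera import PiCamera
from pydng.core import RPICAM2DNG

camera = PiCamera()
camera.framerate = 4             # allow shutter speeds as long as 1/4 s
camera.iso = 800                 # highest sensitivity, for the dim screen material
camera.shutter_speed = 250000    # 0.25 s exposure, in microseconds
time.sleep(2)                    # let the gains settle

# bayer=True appends the raw sensor data to the JPEG file
camera.capture('shot.jpg', format='jpeg', bayer=True)

# extract the Bayer data and write shot.dng alongside the JPEG
RPICAM2DNG().convert('shot.jpg')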

Memory or Memento?

Art, Atoms, Collection, Electrons, Thoughts

One is the ghost in our minds of a past sensory experience and the other is a physical thing we hope can be a gateway to the first. Does the promise of preserving memories destroy them by creating an unmanageable clutter of infinite possibilities which merge into a cold immaterial surface offering no comfort and condemning us to anxiety?

We want to reclaim power over defining the topology of our interior landscapes. It involves art, hammers, digital storage, authentication, originals, blood, and an evening of fun with people.

 

Postcards From The Upside Down, 2020

2D Pixel Arrays

Art, Collection
The Land Is Screaming, 2020

Video Datagraphs. Sony RX100 VI, opencv, ffmpeg.

Slides: mole, venules, bruise, rosacea, jaundice, tightening, marks, fibrillation, gauze, scrape, incomplete
Art, Collection
Sketchz, 2014-2020

Processing Sketches, Custom Linux, PyQt, Teensy, wood, glass.

Displaying algorithmic art

The design for this custom display device started around 2014. At that time, I was working on developing algorithmic art and growing dissatisfied with the lack of means to show it outside the usual computing context. Seeing a mouse and keyboard open up this kind of imagery in a window over a familiar desktop OS on the usual computer hardware dilutes the nature of the experience.

These pieces typically involve coming up with a set of rules that describe how the image is drawn and how it animates. Random variations are introduced in the input parameters and end up generating different versions, each unique but all clearly belonging to the same family. After studying them for a while, you get an intuitive feel for the underlying process shaping them and you automatically start to anticipate how they will continue to develop. A tension builds back and forth between the expectations of the viewer and the validation (or not) of those expectations in the visuals, like watching waves crash on the shore.

Each iteration of these recipes becomes a mini story all its own, with a beginning, a middle and an end. It’s not a coincidence that coders usually use variable names reflecting events in the natural world: birth, death, child, branch, root, etc… The random occurrence of these events gives each of these visual stories its unique character.
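
To make the recipe idea concrete, here is a minimal sketch in that spirit (plain Python writing an SVG; it is illustrative, not the actual Sketchz code, and every name and parameter is made up). The same rules, fed different seeds, produce unique siblings from one family:

import math
import random

def branch(x, y, angle, depth, lines, rng):
    # each recursion is a 'child' of its parent; depth 0 is this lineage's 'death'
    if depth == 0:
        return
    nx = x + math.cos(angle) * depth * 6
    ny = y + math.sin(angle) * depth * 6
    lines.append((x, y, nx, ny))
    for _ in range(rng.choice([1, 2, 2, 3])):    # a random number of children
        branch(nx, ny, angle + rng.uniform(-0.6, 0.6), depth - 1, lines, rng)

def render(seed, path):
    rng = random.Random(seed)    # the seed picks one member of the family
    lines = []
    branch(400, 780, -math.pi / 2, 9, lines, rng)    # the 'root' grows upward
    svg = ['<svg xmlns="http://www.w3.org/2000/svg" width="800" height="800">']
    for x1, y1, x2, y2 in lines:
        svg.append('<line x1="%.1f" y1="%.1f" x2="%.1f" y2="%.1f" stroke="black"/>' % (x1, y1, x2, y2))
    svg.append('</svg>')
    open(path, 'w').write('\n'.join(svg))

for seed in range(3):    # three unique siblings from the same recipe
    render(seed, 'sketch_%d.svg' % seed)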

In the best of cases, the growing intuition that underlying laws have the potential for infinite manifestations invites contemplation in ways found in classical Islamic art.

The intent of this display is to create a space where, even if you don’t see God, at least you won’t have to run into Clippy.

Art, Collection, Electrons

Video Datagraphy

Art, Collection, Photons, Thoughts

The word “slitscan” originally referred to a type of photographic camera that exposes the film through a thin, tall rectangular aperture moved horizontally across the frame. Instead of exposing the whole film surface at once through an iris, these cameras capture the light over time and across the length of the negative, like a scanner or a rotary printing press. They have typically been used for capturing very wide horizontal perspectives in landscape photography or group photos, as well as for creating optical visual effects. With the advent of digital video, this process can be expanded on quite a bit to generate surprising visuals which end up in a place between abstraction and representation, where they can feel both familiar and strange at the same time.

Digital video can be thought of as a cube of data. Each image is a two-dimensional plane of pixels with X and Y coordinates, and these image planes are stacked on top of each other like the floors of a skyscraper. In this cube, a frame of the original video is the XY plane at height t; in our building analogy, this would be the floor plan at a specific floor. What we usually think of as a slitscan is the Yt plane at coordinate X, or, to continue the skyscraper analogy, a cross-sectional slice through the whole height of the building. Ultimately, this cube of data can be processed, dissected, or remixed in arbitrary ways, to the point where the name “slitscan” no longer even makes sense. Video datagraphy is a more appropriate description of the process: making images from video data.
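
As a minimal sketch of the process (assuming OpenCV and NumPy; the file name and column index are illustrative), here is how the Yt plane at a given X can be pulled out of a video:

import cv2
import numpy as np

cap = cv2.VideoCapture('input.mp4')
x = 640          # which vertical slit to sample
columns = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    columns.append(frame[:, x])    # every Y at column x, for this frame t

cap.release()
# stack one pixel column per frame, left to right: the Yt plane at X
datagraph = np.stack(columns, axis=1)
cv2.imwrite('datagraph.png', datagraph)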

 

When we navigate the physical world, we use a constantly changing and always singular perspective to build a mental structure representing what is around us beyond what we can directly experience at that moment. Though these models persist through time in our consciousness, we can never experience them holistically. We are bound by the laws of physics and can not ACTUALLY wrap ourselves around them. As a substitute, we look for patterns that can give us some reference as to where we are on that continuum: a heartbeat, light patterns, seasons, music, speech… Tracking these linear signals informs our conception of space beyond our current perspective and anchors our experience on the mysterious expanse of time.

Storytelling similarly spans time and space. It paints narrative arcs that connect specific events lost inside an infinity of places and moments, and provides a scaffolding on which our understanding of the world is conveyed. Even though stories conjure up a god-like perspective above the physical constraints of our human experience, they are typically linear in nature. Like the three-dimensional shadows of a four-dimensional hypercube we can never actually experience directly, or like the chained humanity in Plato’s cave watching shadows on the wall, they only hint at the existence of a greater context. There is an unfolding that happens and reveals a new dimension. The Bayeux Tapestry is a wonderful example of this linear visual rhythm unfolding on a timeline, and its horizontal shape is unsurprisingly similar to video datagraphs, which represent samples of time and place on a two-dimensional surface.

The physical principles of traditional photography are similar to the way vision works in that both are limited to a moment in space and time; film simply repeats that limited exposure 24 times a second. These space-time video samples do not share those limitations. They feel alien because the process that reveals their hidden structures is not physically achievable for humans, but they feel familiar because we have intuited them mentally from how we have experienced what they represent.

 

Casting a freeway interchange

Art, Atoms

If you live in LA, you are probably familiar with huge freeway interchanges. Each is an impressive network of ramps forming a big knot that somehow channels the various streams of traffic in all the possible ways, with huge curves sweeping above and below the main wide arteries. Anyway, they are cool. My favorite is where the 110 and the 105 meet, and I’ve been mulling over making a cement casting of it for a while now. It starts with grabbing 3D data from the interwebs and turning it into a functional model.

Once the data is cleaned up, I prepare it for 3D printing. I want the final piece to be 26 inches wide, which is wider than any printer I could get my hands on, so I break it up into tiles and run some tests.

The next step is to figure out how to make the mold. There are two options, but the first step for each is the same: glue the tiles back together (there will be 16 of them), use Bondo and sandpaper to smooth out all the irregularities, and spray on some urethane to even out the result. The question is: should I make the urethane mold directly from the positive 3D prints, or should I create a positive plaster intermediate I can tweak and further polish, and make a mold out of that? If I go with the plaster, I worry that the brittle nature of the plaster and the rigidity of PLA filament will cause the small details to break as I release the plaster image from the 3D print. If I pour the mold directly on the 3D print, I worry that the telltale 3D print lines will be captured by the mold and that I will not have had the chance to polish it with the control a plaster intermediate would give.

So, the next step is to test both approaches on my two test prints, and also to test printing a tile with PETG filament, which I hear is more flexible. If you have any recommendations, I’m all ears…

June’s pot harvest

Art, Atoms

This month, I decided to make bigger and heavier pots, while still keeping with the monolithic shapes. I was spurred into creative action by a cardboard tube I saw in a trash pile at work. I noticed it and a lightbulb flashed in my head; I immediately grabbed it, knowing exactly what I was going to do with it. The world is full of gifts indeed, and I love the process of finding sudden and unexpected inspiration at random times. All you have to do is remain open, hone your discernment skills, and channel whatever comes your way. I’m also experimenting with black cement coloring and black river rocks on my old 6-inch round model.

Video Triptych

Art, Atoms, Electrons

Building a synced video triptych with Raspberry Pies and laptop screens

A filmmaker friend, Natasha Maidoff, approached me to design a display device that would allow her to show a video triptych at the 2015 Venice Art Walk. The idea was to create a frame that would contain all the components necessary to play three different videos on three separate screens in perfect sync. It should be simple and should not need to be connected to a bunch of cables or computers. The goal was to hang it, plug it in, turn it on, and forget it… Oh, and did I mention it should not cost an arm and a leg?
Based on my previous tinkering activities, I felt that this project was within my reach, so I started researching parts. My first order was for 3 Raspberry Pi 2 mini computers, 3 replacement laptop LCD screens, and the 3 driver boards needed to connect them to the Pis’ HDMI outputs. Along with that, I got the various AC adapters needed to power all this mess.

Framing and mounting the electronics

The first order of business was to create an aluminum frame to mount the screens to. I got some standard aluminum rods from Home Depot and cut them to size. I drilled some small holes in the vertical aluminum studs and mounted the screens using the small screw holes along their edges that are designed for mounting them in the laptop. Next, I cut two horizontal struts that I screwed the vertical studs to, and ended up with a nice sturdy frame the screens were securely mounted to. For the top strut, I used an angle piece, and for the bottom strut, I used a flat piece that was wide enough to mount the electronics. A few holes and 18 small bolts, nuts and screws later, the electronics were on! I plugged everything in and it all lit up just fine…

Networking the three Raspberry Pi

The next order of business was figuring out how to get the machines to talk to each other. The Raspberry Pi comes with a set of GPIO pins that I could have wired to send signals back and forth between the machines, but it would have required a fair amount of coding and wiring to get the control I needed. Also, since I quickly realized I would need ethernet to log in and customize the Raspbian operating system on each individual machine, I decided to piggyback on this and use OSC for communication over the network. I got a small ethernet switch, pried open the cheap plastic case, liberated the small electronic circuit housed therein, and proceeded to mount it to the frame itself. I connected ethernet cables between the switch and the three machines, and also connected the switch to my home router, which allowed me to connect to and set up the Raspberry Pies from my laptop.
For this to work reliably and predictably, I changed the Pies’ IP addresses to be static, and while I was there, I changed the hostnames and set each machine up to automatically log in to the command line without loading the desktop interface.
To set up the static IP to 192.168.11.45, edit /etc/network/interfaces and replace:
iface eth0 inet dhcp
with:

iface eth0 inet static
address 192.168.11.45
netmask 255.255.255.0
gateway 192.168.11.1
network 192.168.11.0
broadcast 192.168.11.255

To set up the hostname, replace the existing name in /etc/hostname with your desired name, and also list all the other hostnames with their static IPs in /etc/hosts.
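For example, each machine’s /etc/hosts would contain entries along these lines (only the .45 address comes from above; the other hostnames and addresses are made up for illustration):

# illustrative entries; only 192.168.11.45 appears in the text above
192.168.11.45   master
192.168.11.46   slave1
192.168.11.47   slave2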
Lastly, to log directly into the command line interface without having to enter a username and password, edit the file /etc/inittab and replace the line:
1:2345:respawn:/sbin/getty 115200 tty1
with:
1:2345:respawn:/bin/login -f pi tty1 </dev/tty1 >/dev/tty1 2>&1

Getting rid of warts

All this electronickery runs on 12V and 5V DC, and typically, you use one of those lovely black boxy power adapters for each component. At this point in the build, I had everything working, but I was relying on 7 of these wall warts plugged into a power strip to get it all powered. I didn’t have room on the frame itself to mount all this crap and I didn’t want 7 wires coming out of it, so I started looking into a better way to supply power. The monitors needed 12V and each seemed to be running along fine with a 2A supply. The Raspberry Pies and the ethernet switch each used at most 5V at 1A. I searched on ebay and found 12V power supplies usually used for home LED lighting, and figured that getting one rated for 10 amps would be more than enough to cover the requirements of the frame. I also got a step-down converter for the pieces that required 5V.
Wire time… AC power cord into the 12V power supply, 3 sets of wires with DC barrel jacks connecting the 12V output to each of the monitors, and 1 set of wires connecting to the 12V-to-5V step-down converter. The 5V output of the converter goes to the Raspberry Pies and the ethernet switch. One of the challenges here was trying to solder wire to the micro USB plugs I had bought. It’s all way too small and I ended up with an ugly heap, kind of like Jeff Goldblum and the fly in “The Fly”, except with melted lead and plastic. In the end, I just bought some USB cables with the proper connector, cut them, and rewired them to fit my needs.

The software

Auto start

I needed the movies to start automatically when the machines were done booting, so I modified the /home/pi/.bashrc file to run my main script only when the shell starts at the end of a boot sequence (I didn’t want to launch my movie playback script every time I logged in from another machine). I check the output of the tty command, which is /dev/tty1 only for the console login triggered by the boot sequence. I added the following at the top of .bashrc:

if [ "$(tty)" == "/dev/tty1" ]
then
sudo /home/pi/video_player.py
fi
Movie playback

For movie playback, I made the obvious choice and used omxplayer, which is custom written for the Raspberry Pi hardware and can play full 30fps HD video through the GPU, and there’s even a crude little library called pyomxplayer that allows control from python. In order to get the pyomxplayer library to run, I had to install the pexpect python library, which allows a script to spawn and control the omxplayer process. Also, pyomxplayer tries to parse the text output by omxplayer, but it seems like that output has changed and causes the script to fail and exit, so I had to remove that part of the code. I also added a function to allow me to rewind the movie. As soon as my script starts, omxplayer loads the appropriate movie file and pauses at the beginning.
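
The heart of that approach looks something like this minimal sketch (assuming omxplayer’s standard single-key console controls; the movie path is illustrative):

import pexpect

# spawn omxplayer and immediately pause it on the first frame
player = pexpect.spawn('omxplayer -o hdmi /var/ramdisk/movie.mp4')
player.send('p')    # 'p' toggles pause

# ...later, when all three machines are ready:
player.send('p')    # unpause

# ...and when it is time to stop:
player.send('q')    # quit omxplayer cleanly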

Syncing the movies

As for syncing the start of the three movies, I used pyOSC to have the machines automatically establish a connection when they boot up and unpause the movies at the same instant once all three machines are ready. The basic process goes like this: I designate one machine to be the master and the two others to be slaves. When the master boots up, it first listens for a signal from each of the slaves, and stays in this mode until it has heard from both. On their end, the slaves’ first action during launch is to send a signal to the master. As soon as the master has heard from both slaves, it tells the slaves to switch to a state where they listen to the master for commands. At this point, the master unpauses the movie it previously loaded and tells the slaves to do the same with theirs. Since omxplayer has no looping function I could find that worked for me, I have the master wait for the length of the movie, then rewind the movies to the beginning and start them playing over again.

Playing back from the RAM disk

In order to avoid continually reading from the SD card, I created a RAM disk so my script could copy the movie file to it and let omxplayer play back from there.
I created the directory for the ram disk’s mount point:
sudo mkdir /var/ramdisk
and added the following to the /etc/fstab file:
ramdisk /var/ramdisk tmpfs nodev,nosuid,size=500M 0 0
The Raspberry Pi 2 comes with 1 GB of RAM, so I used half of it for the drive, which left plenty for the OS to run.

Timing issues

Fundamentally, the pexpect approach pyomxplayer uses is ingenious but somewhat hacky, and it took me a long time to get everything to work properly. I found that if my script sent commands to omxplayer too fast, omxplayer would miss them. I had to put a bunch of sleep statements in my code to pause and allow enough time for commands to get properly “heard” by the omxplayer process. Also, I added a long pause right after the movie is first loaded into omxplayer to make sure any residual, processor-intensive boot process has a chance to finish and does not interfere with the responsiveness of each machine. It’s far from a robust setup; it’s basically a script that tells three machines to press the “play” button at the same time. Still, I was able to get the movies to sync well enough across all three machines: not guaranteed to be frame perfect every time, but reliably within one or two frames. Ultimately, I would love for someone to write a true python API for omxplayer.

Powering off and rebooting through GPIO

Since the frame will be fully autonomous, it will not have a keyboard to allow an operator to properly power off the Raspberry Pies. This is a challenge because if you simply cut the power to turn them off, the SD card will most likely get corrupted and you will not be able to reboot the machine. So, I needed to set up a simple way to cleanly shut down the machines before turning the power off. I also wanted to be able to reboot if somehow something unexpected happened that threw the machines out of sync. So, I connected the master machine to a couple of switches and set the script up to power off or reboot when the GPIO interface detects a specific button press.
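
The button handling itself only takes a few lines with the RPi.GPIO library. A minimal sketch (the pin numbers are illustrative; the buttons pull the pins to ground, and the script already runs as root so the shutdown commands work):

import subprocess
import time
import RPi.GPIO as GPIO

SHUTDOWN_PIN, REBOOT_PIN = 23, 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(SHUTDOWN_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(REBOOT_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def on_shutdown(channel):
    subprocess.call(['shutdown', '-h', 'now'])    # clean halt protects the SD card

def on_reboot(channel):
    subprocess.call(['reboot'])

GPIO.add_event_detect(SHUTDOWN_PIN, GPIO.FALLING, callback=on_shutdown, bouncetime=500)
GPIO.add_event_detect(REBOOT_PIN, GPIO.FALLING, callback=on_reboot, bouncetime=500)

while True:    # idle forever; the callbacks do the work
    time.sleep(1)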

The code

Here is the code. You will need to install pyomxplayer, pexpect, and pyOSC. Also, depending on your version of omxplayer, you may need to modify pyomxplayer because of the way it attempts to parse omxplayer’s output. The same script works for both the master and the slaves based on the hostname. Here is the basic process:

When all machines boot up, they copy the relevant media to the ram drive, and launch it in omxplayer in a paused state
The slaves then go into a startup loop sending a “ready” address to the master until they hear back.
The master goes into a loop to check if both slaves are ready, listening for a “ready” address from each.
When the master determines that both slaves are ready, it sends a “listen” address to the slaves and the slaves come out of their startup loop.
From here on out, the master controls the slaves through OSC to unpause/pause/rewind the movies indefinitely.
The machines either reboot or shut down when the specific GPIO pin gets set on the master.
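
As a minimal sketch of the master side of that handshake (assuming the pyOSC library; the hostnames, port, and OSC addresses are illustrative):

import socket
from OSC import OSCClient, OSCMessage, OSCServer

PORT = 9000
SLAVES = ['slave1', 'slave2']
ready = set()

def on_ready(path, tags, args, source):
    ready.add(args[0])    # each slave sends its hostname as the argument

def send(host, address):
    client = OSCClient()
    client.connect((host, PORT))
    client.send(OSCMessage(address))
    client.close()

if socket.gethostname() == 'master':
    server = OSCServer(('0.0.0.0', PORT))
    server.addMsgHandler('/ready', on_ready)
    while len(ready) < len(SLAVES):    # startup loop: wait for both slaves
        server.handle_request()
    for host in SLAVES:
        send(host, '/listen')    # take the slaves out of their startup loop
    for host in SLAVES:
        send(host, '/play')      # unpause everyone, master included, together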

Framing


Pretty quickly in the process, I realized that with the amount of work I had on my plate, as well as my day job, I would not have the time to build a frame that would do justice to the project. Natasha brought in Hugo Garcia to help. We went back and forth between modern designs that would let you see through to the electronics and the more traditional framing approach we ended up settling on. Since the screens and computers were all mounted to the central aluminum structure, the process was not too difficult; it was just a matter of hanging a skin on it. Hugo came up with a cool Arts and Crafts inspired box frame and an ingenious setup of brackets that sandwiched the electronics in place. The most sensitive part was getting precise measurements of the screens to make sure the CNC’d mat fit perfectly over them.
The last few details consisted of mounting the tactile switches connected to the GPIO pins for shutting down or rebooting the machines, soldering on a mini jack connected to the master Raspberry Pi to provide audio output, and lastly, installing a main power switch for the whole unit.

Hanging

Here it is… We’re very happy! The work looks great. After seeing Natasha’s piece running on it in perfect sync, we’re all very inspired to work with this triptych format. There are so many ideas to explore. Since we’re hoping for this to sell at the auction for the Venice Family Clinic, we’re already planning to build another one for us to play with.

Parts list

In case you are foolish enough to go down the same route I did, here’s what I used:
3 Raspberry Pi 2
3 LTN156AT02-D02 Samsung 15.6″ WXGA replacement screens (ebay)
3 HDMI+DVI+VGA+Audio controller board driver kits for LTN156AT02 (ebay)
1 D-Link 5-Port 10/100 Mbps Fast Ethernet Switch
1 DC 12V 10A regulated power supply (typically used for LED strips)
1 DC-DC 12V/24V to 5V 10A step-down converter module
Aluminum, micro USB cables, 3 ethernet cables, nuts and bolts, tactile switches, power switch, wire…