“A terrible idea that should never have been tried!” – Ansel Adams, April 2021.
This image was captured by shooting the film back of a Brownie 620 camera with a Raspberry Pi camera mounted inside the black chamber above the lens, just outside the optical path. High-gain screen material is used on the image plane to maximize the reflected light. It works, sort of… The current limitation is that very little light reaches the sensor, so even in full sun I have to use the highest ISO available and about a 0.25 second exposure. Also, since the sensor is not in line with the lens, the captured image is skewed and you end up with soft focus at the top and bottom.

First image, un-warped and brightened.

It's a fun project, and I'm sure I'll complete it someday. I am waiting to see if a higher-sensitivity chip will become available for the RPi. Ultimately, the best solution would be either to manufacture a lens element that would refocus the 120 film size image onto the small sensor mounted in the back, or a concave mirror on the back that would reflect the image onto the sensor where it is currently mounted (both of these solutions are beyond my pay grade).
This uses a Raspberry Pi Zero W and a small LCD panel, both powered by a PiSugar battery. The operating system is DietPi. The camera is operated with the buttons on the LCD panel board and the resulting image is displayed on the screen. Using VNC and a virtual desktop, I also get the live feed from the camera directly on a laptop. The pictures are saved with the raw sensor information and converted to DNG. I designed a 3D-printed adapter to mount the electronic components on the body of the camera, but I am waiting on more light-sensitive solutions before finishing the piece.
“Slitscan” is the name given to a photographic technique that uses a special lens with a thin, tall rectangular aperture that is moved horizontally to create an exposure. Instead of exposing the whole film surface at once through an iris, these lenses capture light over time and across the length of the negative, like a scanner or a rotary printing press. They have typically been used for capturing very wide horizontal perspectives in landscape photography or group photos, as well as to create optical visual effects. With the advent of digital video, the process can be expanded on to generate surprising visuals which exist somewhere between abstraction and representation, in a place where they feel both familiar and strange at the same time.
Digital video can be thought of as a cube of data. Each image is a two-dimensional plane of pixels with X and Y coordinates, and these image planes are stacked on top of each other like the floors of a skyscraper. In this cube, a frame of the original video is the ‘XY’ plane at height ‘t’. In our building analogy, this would be the floor plan at a specific floor. What we usually think of as a slitscan is the ‘Yt’ plane at coordinate X or, to continue the skyscraper analogy, a cross-sectional slice through the whole height of the building. Ultimately, this cube of data can be processed, dissected, or remixed in arbitrary ways, to the point where the name “slitscan” no longer makes sense. Video datagraphy is a more appropriate description of the process: making images from video data.
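For the programmatically inclined, here is a minimal sketch of that slice, just as an illustration and not part of any particular project: it uses OpenCV in C++ to read a hypothetical input.mp4, grabs pixel column X from every frame, and stacks those columns left to right so that the horizontal axis of the output image becomes time.

// Extract the Yt plane at a fixed X from a video: one output column per frame.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap("input.mp4");              // hypothetical input file
    if (!cap.isOpened()) return 1;

    int width  = static_cast<int>(cap.get(cv::CAP_PROP_FRAME_WIDTH));
    int height = static_cast<int>(cap.get(cv::CAP_PROP_FRAME_HEIGHT));
    int frames = static_cast<int>(cap.get(cv::CAP_PROP_FRAME_COUNT));
    int x = width / 2;                              // the fixed X coordinate of the slit

    // Output image: the vertical axis stays Y, the horizontal axis becomes t.
    cv::Mat slitscan(height, frames, CV_8UC3, cv::Scalar::all(0));

    cv::Mat frame;
    for (int t = 0; t < frames && cap.read(frame); ++t) {
        frame.col(x).copyTo(slitscan.col(t));       // column X of frame t -> column t of the output
    }

    cv::imwrite("slitscan.png", slitscan);
    return 0;
}

Slice along a different axis, or move X a little every frame, and you get the more arbitrary remixes of the cube described above.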
When we navigate the physical world, we use a constantly changing and always singular perspective to build a mental model representing what is around us beyond what we can directly experience at that moment. Though these models persist through time in our consciousness, we can never experience them holistically. We are bound by the laws of physics and cannot ACTUALLY wrap ourselves around them. As a substitute, we look for patterns that can give us some reference as to where we are on that continuum: a heartbeat, light patterns, seasons, music, speech… Tracking these linear signals informs our conception of space beyond our current perspective and anchors us on the mysterious expanse of time.
Storytelling similarly spans time and space. It draws narrative arcs that connect specific events lost inside an infinity of places and moments, and provides a scaffolding on which our understanding of the world is conveyed. Even though stories conjure up a god-like perspective above the physical constraints of our human experience, one we can never directly inhabit, they unfurl in a linear manner that gives us glimpses of that higher dimension, like shadows of a four-dimensional hypercube projected onto the three dimensions we live in.
The physical principles of traditional photography are similar to the way vision works in that they are limited to a single momentary perspective in space and time. Video datagrams look strange because the way they expose the structures of time is not physically possible for us, but they feel familiar because we have intuited, from our experience of time, what they end up revealing in a static, coherent image.
The long-awaited part 2 of this blog post has finally arrived! Though I've been tinkering on this project for the past two years, I decided to write it up to coincide with the outrageous arrest of 14-year-old tinkerer Ahmed Mohamed, who was handcuffed because his teacher thought his electronic clock project looked like a bomb. This binary clock project of mine ended up being a personal introductory course in electronic circuit design. You can download all the relevant files here if you want to make your own.

I forgot what the original inspiration was, but what I'm ending up with is a working binary clock on a custom printed circuit board. Ultimately, this project will involve fiber optics inside polished cement for a unique timepiece, but we're not there yet… Here's what I learned so far:
Bit shifters
A binary clock needs 20 individual blinky things and the Arduino doesn't have that many pins to spare, so I needed to figure out a way to create more individually addressable outputs. The solution I found is the 74HC595N shift register, which turns a few Arduino pins (data, clock, and latch) into 8 outputs. In fact, you can daisy-chain them in series and they can provide you with any number of outputs in multiples of 8. I decided to use one to drive the hours display, one for the minutes, and one for the seconds.
There are tons of tutorials for them, so it was fairly straightforward to get it working.
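As a quick sanity check, a minimal test sketch along these lines (my assumption of a typical hookup, using the same latch/clock/data pins as the clock code further down) counts up in binary on the register's 8 outputs; with several registers daisy-chained, you just call shiftOut() once per chip before raising the latch:

// Minimal 74HC595 test sketch: counts up in binary on the 8 outputs.
const int latchPin = 8;   // ST_CP
const int clockPin = 12;  // SH_CP
const int dataPin  = 11;  // DS

void setup() {
  pinMode(latchPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
  pinMode(dataPin, OUTPUT);
}

void loop() {
  for (int value = 0; value < 256; value++) {
    digitalWrite(latchPin, LOW);                   // hold the outputs steady while shifting
    shiftOut(dataPin, clockPin, MSBFIRST, value);  // push 8 bits into the register
    // with registers daisy-chained, call shiftOut() once more per extra chip here
    digitalWrite(latchPin, HIGH);                  // latch: all 8 outputs update at once
    delay(250);
  }
}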
Keeping time
While you can keep time with just an Arduino, it is not very accurate and it has no way to keep the clock going if you unplug the power. I used a DS1307 real-time clock (RTC) chip, which is designed for just this purpose. It keeps time accurately and uses a small battery to continue keeping time when the power is disconnected.
Again, there is an Arduino library available (DS1307RTC, used together with the Time library) and a good amount of tutorials out there, so it wasn't too hard to test it and incorporate it into the build.
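For reference, a bare-bones test with the DS1307RTC and Time libraries looks something like this (just a test sketch that prints the time over serial once a second, not part of the clock code below):

// Minimal DS1307 test: print the current time over serial once a second.
#include <Wire.h>
#include <TimeLib.h>
#include <DS1307RTC.h>

tmElements_t tm;

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (RTC.read(tm)) {   // fills tm with the current time from the chip
    Serial.print(tm.Hour);
    Serial.print(":");
    Serial.print(tm.Minute);
    Serial.print(":");
    Serial.println(tm.Second);
  } else {
    Serial.println("DS1307 not responding (check wiring / backup battery)");
  }
  delay(1000);
}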
Removing the Arduino
Eventually, since I wanted to end up with a single circuit board, I didn't want to have to plug anything into an Arduino. Once again, the internet is a wonderful resource, and it allowed me to figure out how to put only the Arduino components I actually needed onto a breadboard. I can upload the code onto the chip by seating it in an Arduino, then pull it off of there and mount it directly onto the breadboard.
Code
The clock can be set to one of 4 modes: display time, set hours, set minutes, or set seconds. There are two buttons. One button toggles between the modes, while the other increments the count of the hours, minutes, or seconds, depending on the current set mode.
// Libraries for the DS1307 RTC (Wire for I2C, TimeLib for tmElements_t)
#include <Wire.h>
#include <TimeLib.h>
#include <DS1307RTC.h>

// Pin connected to ST_CP of 74HC595
const int latchPin = 8;
// Pin connected to SH_CP of 74HC595
const int clockPin = 12;
// Pin connected to DS of 74HC595
const int dataPin = 11;

// Pins for setting the time
const int button0Pin = 5;
const int button1Pin = 6;

// Variables for debounce
int button0State;
int button1State;
int previousButton0State = LOW;
int previousButton1State = LOW;

long lastDebounce0Time = 0;  // the last time the output pin was toggled
long lastDebounce1Time = 0;  // the last time the output pin was toggled
long debounceDelay = 50;     // the debounce time; increase if the output flickers

int button0Mode = 0;

// object to communicate with the RTC
tmElements_t tm;
void setup() {
  // Setup pins for the shift register
  pinMode(latchPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
  pinMode(dataPin, OUTPUT);

  // Setup pins for manual time setting
  pinMode(button0Pin, INPUT);
  pinMode(button1Pin, INPUT);
}
void loop() {
  RTC.read(tm);

  // button 0 can be in display time, hour set, minute set, and second set modes.
  if( button0Mode == 0){
    timeDisplayMode();
  } else if( button0Mode == 1){
    setHourMode();
  } else if( button0Mode == 2){
    setMinuteMode();
  } else if( button0Mode == 3){
    setSecondMode();
  }

  //
  // mode switching
  //
  // Just after a button is pushed or released, there is noise where the value returned is 0/1 random.
  // Debouncing consists of reading the incoming value and, if it has changed, storing the time the change was noticed.
  // It then continues to check and if, after a certain amount of time (delay), the input value is still the changed
  // value noticed before, then it means that an actual state change happened.
  int reading0 = digitalRead(button0Pin);

  // this only happens when the input value is different from the last time the value was read
  if (reading0 != previousButton0State) {
    lastDebounce0Time = millis();
  }

  // we only get in here if the returned value hasn't changed in a while, which means we are not in the noisy transition
  if ((millis() - lastDebounce0Time) > debounceDelay) {
    // if we are in here, it means we got two similar readings.
    if (reading0 != button0State) {
      // if the two similar readings we got are different from the stored state, we must have changed
      button0State = reading0;
      if(reading0 == HIGH){
        button0Mode = (button0Mode+1)%4;
      }
    }
  }
  previousButton0State = reading0;

  // (button1 is debounced the same way and increments the hours/minutes/seconds in the set modes;
  // that code and the display/set-mode functions are in the full sketch in the download at the top of the page)
}
// create the binary value for the digits of the hour/minute/second number
// offset represents how many bits are used for the tens.
// e.g. setTimeBits(42, 3) puts the tens digit (4) in bits 0-2 and the ones digit (2) in the bits starting at bit 3.
int setTimeBits(int n, int offset){
  int n1 = n%10;        // ones digit
  int n0 = (n-n1)/10;   // tens digit
  return n0 | n1<<offset;
}
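The display functions themselves aren't shown in this excerpt, but to give an idea of how the pieces fit together, here is a hypothetical example (displayMinutes() is my illustrative name, not something in the actual clock code, and the bit order depends on how the LEDs are wired) of pushing the packed bits from setTimeBits() out to one of the shift registers:

// Hypothetical helper: pack the current minutes and push them out to a 74HC595.
void displayMinutes() {
  int bits = setTimeBits(tm.Minute, 3);        // tens of minutes in bits 0-2, ones starting at bit 3
  digitalWrite(latchPin, LOW);                 // hold the outputs while shifting
  shiftOut(dataPin, clockPin, MSBFIRST, bits); // one byte per register
  digitalWrite(latchPin, HIGH);                // latch the new value onto the LEDs
}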
Schematic and board
Once the breadboard was working, I set out to sketch the circuit in Fritzing. While it's not quite as intimidating as EAGLE CAD, I ended up using the latter after running into some limitations with the former (I don't remember what they were). There was a lot for me to learn there but, in the end, it's conceptually pretty simple: all the pieces have to be connected together correctly. The schematic is just another way to represent the circuit. Once that was done, I started on the board. I laid out all the components and let the software's autorouter figure out how to create the correct traces. One cool thing is that if you choose the correct electronic components in the software, all the sizes and shapes are properly represented when you are designing the board. It's a huge pain in the ass to sort through all the libraries of components, though, especially when you don't know what all the specs mean.
The EAGLE CAD files are included in the download file at the top of this page.
Manufacturing the board
Super simple: just go online and find a service that will manufacture them. For this project, I used oshpark.com and dirtypcbs.com, both of which let you upload your designs straight out of EAGLE CAD. After a few weeks, you get your boards in the mail, ready for you to solder the components on. I order my components from mouser.com, which lets you save a collection of various components into a project-specific list. Again, finding the right components amongst the tens of thousands they have available is really time-consuming and annoying. But now I have my parts list, so I never have to go through that again if I want to solder up new versions of the board. The list of parts is included in the download file at the top of this page.
The ugly truth
If you were paying attention, you no doubt noticed in the preceding paragraph that I used two board manufacturers. That is because the first board layouts I had printed actually had shorts. I suppose that's probably not uncommon, but it's really frustrating to upload your designs, order the boards, wait for them to be delivered, and spend all that time soldering, only to find out the board doesn't work and then struggle to figure out where the wires are getting crossed. In the end, I spent about $150 on boards and parts that ended up not working. I guess that's the cost of learning… My first two board designs were ordered through OSH Park, with a minimum order of 3 boards for about $50. The third order was through DirtyPCBs and was $25 for 10 boards. They feel cheaper and took forever to be delivered, but you sure can't beat the price.