This month, I've decided to make bigger and heavier pots while still keeping with the monolithic shapes. I was spurred into creative action by a cardboard tube I spotted in a trash pile at work. A lightbulb flashed in my head; I immediately grabbed it, knowing exactly what I was going to do with it. The world is full of gifts indeed, and I love finding sudden, unexpected inspiration at random times. All you have to do is remain open, hone your discernment skills, and channel whatever comes your way. I'm also experimenting with black cement coloring and black river rocks on my old 6-inch round model.
Video Triptych
Building a synced video triptych with Raspberry Pis and laptop screens
A filmmaker friend, Natasha Maidoff, approached me to design a display device that would allow her to show a video triptych at the 2015 Venice Art Walk. The idea was to create a frame containing all the components necessary to play three different videos on three separate screens in perfect sync. It should be simple and shouldn't need to be connected to a bunch of cables or computers. The goal was to hang it, plug it in, turn it on, and forget it… Oh, and did I mention it shouldn't cost an arm and a leg?
Based on my previous tinkering activities, I felt this project was within my reach, so I started researching parts. My first order was for 3 Raspberry Pi 2 mini computers, 3 replacement laptop LCD screens, and the 3 driver boards needed to connect them to the Pis' HDMI outputs. Along with that, I got the various AC adapters required to power this whole mess.
Framing and mounting the electronics
Networking the three Raspberry Pis
For this to work reliably and predictably, I gave the Pis static IP addresses, and while I was there, I changed their hostnames and set them up to automatically log in to the command line without loading the desktop interface.
To set the static IP to 192.168.11.45 (the Pis talk to each other over the wired network here, so the relevant interface is eth0), edit /etc/network/interfaces and replace:
iface eth0 inet dhcp
with:
iface eth0 inet static
address 192.168.11.45
netmask 255.255.255.0
gateway 192.168.11.1
network 192.168.11.0
broadcast 192.168.11.255
To set the hostname, replace it with your desired name in /etc/hostname, and also list all the machines' hostnames with their static IPs in /etc/hosts.
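For example, using the addresses above (they match the hosts dictionary in the script further down), each Pi's /etc/hosts would include entries like:
192.168.11.45    master
192.168.11.46    slave1
192.168.11.47    slave2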
Lastly, to log directly into the command line interface without having to enter a username and password, edit the file /etc/inittab and replace the line:
1:2345:respawn:/sbin/getty 115200 tty1
with:
1:2345:respawn:/bin/login -f pi tty1 </dev/tty1 >/dev/tty1 2>&1
Getting rid of warts
Wire time… AC power cord into the 12V power supply; three sets of wires with DC barrel jacks connecting the 12V output to each of the monitors; one set of wires to the 12V-to-5V step-down converter; and the 5V output of the converter to the Raspberry Pis and the Ethernet switch. One of the challenges here was trying to solder wire to the micro USB plugs I had bought. It's all way too small and I ended up with an ugly heap, kind of like Jeff Goldblum and the fly in "The Fly", except with melted lead and plastic. In the end, I just bought some USB cables with the proper connector, cut them, and rewired them to fit my needs.
The software
Auto start
I needed the movies to start automatically when the machines were done booting, so I modified /home/pi/.bashrc to run my main script only when it was being accessed at the end of a boot sequence (I didn't want to launch my movie playback script every time I logged in from another machine). I check the $L1 environment variable, which is set to "tty1" only when logging in to the console from the boot sequence. I added the following at the top of .bashrc:
if [ "$L1" = "tty1" ]; then
    sudo /home/pi/video_player.py
fi
Movie playback
For movie playback, I made the obvious choice and used omxplayer, which is written specifically for the Raspberry Pi hardware and can play full 30fps HD video using the GPU; there's even a crude little library called pyomxplayer that allows control from Python. To get pyomxplayer to run, I had to install the pexpect Python library, which lets a script spawn and control the omxplayer process. Also, pyomxplayer tries to parse the text output by omxplayer, but it seems that output has changed since the library was written, which causes the script to fail and exit, so I had to remove that part of the code. I also added a function to let me rewind the movie. As soon as my script starts, omxplayer loads the appropriate movie file and pauses at the beginning.
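For reference, here's a minimal sketch of the kind of rewind method I mean, added to pyomxplayer's OMXPlayer class. It assumes the library keeps its pexpect handle in self._process (this may differ between versions), and relies on omxplayer seeking back 600 seconds per down-arrow keypress:

# hypothetical addition to the OMXPlayer class in pyomxplayer
def rewind(self):
    # each down arrow asks omxplayer to seek back 600 seconds;
    # a few of them walk a short movie back to its start
    for _ in range(3):
        self._process.send("\x1b[B")   # ANSI down-arrow escape sequence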
Syncing the movies
As for syncing the start of the three movies, I used pyOSC to have the machines automatically establish a connection when they boot up and unpause the movies at the same instant once all three machines are ready. The basic process goes like this: I designate one machine as the master and the two others as slaves. When the master boots up, it first listens for a signal from each of the slaves and stays in this mode until it has heard from both. On their end, the slaves' first action after launch is to send a signal to the master. As soon as the master has heard from both slaves, it tells them to switch to a state where they listen to the master for commands. At this point, the master unpauses the movie it previously loaded and tells the slaves to do the same with theirs. Since omxplayer has no looping function I could find that worked for me, I have the master wait for the length of the movie, rewind the movies to the beginning, and start them playing all over again.
Playing back from the RAM disk
In order to avoid continually reading from the SD card, I created a RAM disk so my script could copy the movie file to it and let omxplayer play back from there.
I created the directory for the ram disk’s mount point:
sudo mkdir /var/ramdisk
and added the following to the /etc/fstab file:
ramdisk /var/ramdisk tmpfs nodev,nosuid,size=500M 0 0
The Raspberry Pi 2 comes with 1 GB of RAM, so I used half of it for the drive, which left plenty for the OS to run.
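One small detail: the fstab entry is applied at boot, but you can mount the RAM disk immediately by remounting everything listed in fstab:
sudo mount -a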
Timing issues
Fundamentally, pyomxplayer and the pexpect approach it uses are an ingenious but somewhat hacky way to control the process, and it took me a long time to get everything working properly. I found that if my script sent commands to omxplayer too quickly, omxplayer would miss them, so I had to put a bunch of sleep statements in my code to allow enough time for each command to get properly "heard" by the omxplayer process. I also added a long pause right after the movie is first loaded to make sure any residual, processor-intensive boot activity has a chance to finish and doesn't interfere with the responsiveness of each machine. It's far from a robust setup; it's basically a script that tells three machines to press the "play" button at the same time. Still, I was able to get the movies to sync well enough across all three machines: not guaranteed to be frame-perfect every time, but reliably within one or two frames. Ultimately, I would love for someone to write a true Python API for omxplayer.
Powering off and rebooting through GPIO
Since the frame will be fully autonomous, it won't have a keyboard to let an operator properly power off the Raspberry Pis. This is a challenge because if you simply pull the power to turn them off, the SD cards will most likely get corrupted and the machines won't boot again. So I needed a simple way to cleanly shut down the machines before turning off the power. I also wanted to be able to reboot in case something unexpected threw the machines out of sync. So, I connected the master machine to a couple of switches and set the script up to power off or reboot when the GPIO interface detects the corresponding button press.
The code
Here is the code. You will need to install pyomxplayer, pexpect, and pyOSC. Depending on your version of omxplayer, you may also need to modify pyomxplayer because of the way it attempts to parse omxplayer's output. The same script works for both the master and the slaves based on the hostname. Here is the basic process:
When the machines boot up, they copy the relevant media to the RAM disk and launch omxplayer in a paused state.
The slaves then go into a startup loop sending a “ready” address to the master until they hear back.
The master goes into a loop to check if both slaves are ready, listening for a “ready” address from each.
When the master determines that both slaves are ready, it sends a “listen” address to the slaves and the slaves come out of their startup loop.
From here on out, the master controls the slaves through OSC to unpause/pause/rewind the movies indefinitely.
The machines either reboot or shut down when the corresponding GPIO pin is triggered on the master.
#!/usr/bin/python
import OSC
import threading, socket, shutil, subprocess
from pyomxplayer import OMXPlayer
from time import sleep

# static IPs and movie files for each machine
hosts = {'master':'192.168.11.45', 'slave1':'192.168.11.46', 'slave2':'192.168.11.47'}
movies = {'master':'/home/pi/media/movie_lf.mov',
          'slave1':'/home/pi/media/movie_cn.mov',
          'slave2':'/home/pi/media/movie_rt.mov'}
movieLength = 60*5

hostname = socket.gethostname()

# copy this machine's movie to the RAM disk and play it from there
print "copying %s to /var/ramdisk" % movies[hostname]
shutil.copy(movies[hostname], "/var/ramdisk/")
movies[hostname] = movies[hostname].replace('/home/pi/media', '/var/ramdisk')

# load the movie, then immediately pause and rewind it
print "playing movie from %s" % movies[hostname]
omx = OMXPlayer(movies[hostname])
sleep(5)
omx.toggle_pause()
sleep(1)
omx.rewind()
sleep(1)

def reboot():
    command = "/usr/bin/sudo /sbin/shutdown -r now"
    process = subprocess.Popen(command.split(), stdout=subprocess.PIPE)
    output = process.communicate()[0]
    print output

def poweroff():
    command = "/usr/bin/sudo /sbin/shutdown -h now"
    process = subprocess.Popen(command.split(), stdout=subprocess.PIPE)
    output = process.communicate()[0]
    print output

if hostname == 'master':

    # watch the two GPIO buttons for reboot/poweroff requests
    def gpio_check():
        import RPi.GPIO as GPIO
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(23, GPIO.IN, pull_up_down=GPIO.PUD_UP)
        GPIO.setup(24, GPIO.IN, pull_up_down=GPIO.PUD_UP)
        while True:
            input_state = GPIO.input(23)
            if input_state == False:
                print('Detected a reboot request: Button 23 Pressed')
                send_reboot(slaves['slave1'][2])
                send_reboot(slaves['slave2'][2])
                sleep(.3)
                reboot()
            input_state = GPIO.input(24)
            if input_state == False:
                print('Detected a poweroff request: Button 24 Pressed')
                send_poweroff(slaves['slave1'][2])
                send_poweroff(slaves['slave2'][2])
                sleep(.3)
                poweroff()
            sleep(.1)

    # send listen command
    def send_listen(slaveServer):
        print "querying slave", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/listen")
        msg.append(" the master.\nThe master is pausing for 20 seconds.")
        slaveServer.send(msg)

    # send play command
    def send_play(slaveServer):
        print "sending play to", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/play")
        msg.append("the master")
        slaveServer.send(msg)

    # send toggle_pause command
    def send_toggle_pause(slaveServer):
        print "sending toggle_pause to", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/toggle_pause")
        msg.append("the master")
        slaveServer.send(msg)

    # send rewind command
    def send_rewind(slaveServer):
        print "sending rewind to", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/rewind")
        msg.append("the master")
        slaveServer.send(msg)

    # send reboot command
    def send_reboot(slaveServer):
        print "sending reboot to", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/reboot")
        msg.append("the master")
        slaveServer.send(msg)

    # send poweroff command
    def send_poweroff(slaveServer):
        print "sending poweroff to", slaveServer
        msg = OSC.OSCMessage()
        msg.setAddress("/poweroff")
        msg.append("the master")
        slaveServer.send(msg)

    # handler for the ready address
    def ready_handler(addr, tags, stuff, source):
        if not slaves[stuff[0]][0]:
            slaves[stuff[0]][0] = True
            print "setting %s to ready" % stuff[0]

    # set up clients to send messages to the slaves
    slavesReady = False
    c1 = OSC.OSCClient()
    c2 = OSC.OSCClient()
    slaves = {'slave1':[False, (hosts['slave1'], 9000), c1],
              'slave2':[False, (hosts['slave2'], 9000), c2]}

    # set up self to receive messages
    receive_address = hosts['master'], 9000
    s = OSC.OSCServer(receive_address)
    s.addDefaultHandlers()
    s.addMsgHandler("/ready", ready_handler)

    # start the OSCServer
    print "\nStarting OSCServer. Use ctrl-C to quit."
    st = threading.Thread(target=s.serve_forever)
    st.start()

    # set the address for all following messages
    slaves['slave1'][2].connect(slaves['slave1'][1])
    slaves['slave2'][2].connect(slaves['slave2'][1])

    # start watching the buttons now that the slave clients exist
    gpio_thread = threading.Thread(target=gpio_check)
    gpio_thread.start()

    #########
    # establish communication
    print "Master is waiting to hear from the slaves."

    # the master waits until both slaves are ready
    while not slavesReady:
        sleep(.01)
        if slaves['slave1'][0] and slaves['slave2'][0]:
            slavesReady = True
    print "The master has heard from both slaves"

    # the master tells the slaves to listen
    send_listen(slaves['slave1'][2])
    send_listen(slaves['slave2'][2])
    print "The master has told the slaves to listen"
    print "Pausing for 20 seconds"
    # catch our breath
    sleep(20)

    #########
    # media control
    # we go into an infinite loop where we unpause, wait for the
    # movie length, pause, wait, rewind, wait, and unpause again
    print "entering main loop"
    while True:
        send_toggle_pause(slaves['slave1'][2])
        send_toggle_pause(slaves['slave2'][2])
        omx.toggle_pause()
        sleep(movieLength)
        send_toggle_pause(slaves['slave1'][2])
        send_toggle_pause(slaves['slave2'][2])
        omx.toggle_pause()
        sleep(2)
        send_rewind(slaves['slave1'][2])
        send_rewind(slaves['slave2'][2])
        omx.rewind()
        sleep(2)

else:
    thisName = hostname
    thisIP = hosts[hostname], 9000
    masterStatus = {'awake':[False], 'play':[False]}
    masterAddress = hosts['master'], 9000

    def send_ready(c):
        msg = OSC.OSCMessage()
        msg.setAddress("/ready")
        msg.append(thisName)
        try:
            c.send(msg)
        except:
            pass

    def listen_handler(add, tags, stuff, source):
        print "I was told to listen by %s" % stuff[0]
        masterStatus['awake'][0] = True

    def play_handler(add, tags, stuff, source):
        print "I was told to play by %s" % stuff[0]
        masterStatus['play'][0] = True

    def toggle_pause_handler(add, tags, stuff, source):
        print "I was told to toggle_pause by %s" % stuff[0]
        omx.toggle_pause()

    def rewind_handler(add, tags, stuff, source):
        print "I was told to rewind by %s" % stuff[0]
        omx.rewind()
        # resume announcing readiness until the master takes control again
        masterStatus['awake'][0] = False

    def reboot_handler(add, tags, stuff, source):
        print "I was told to reboot by %s" % stuff[0]
        reboot()

    def poweroff_handler(add, tags, stuff, source):
        print "I was told to poweroff by %s" % stuff[0]
        poweroff()

    ###########
    # create a client to send messages to the master
    c = OSC.OSCClient()
    c.connect(masterAddress)

    ###########
    # listen to messages from the master
    receive_address = thisIP
    s = OSC.OSCServer(receive_address)

    # define handlers
    s.addDefaultHandlers()
    s.addMsgHandler("/listen", listen_handler)
    s.addMsgHandler("/play", play_handler)
    s.addMsgHandler("/toggle_pause", toggle_pause_handler)
    s.addMsgHandler("/rewind", rewind_handler)
    s.addMsgHandler("/reboot", reboot_handler)
    s.addMsgHandler("/poweroff", poweroff_handler)

    # start the OSCServer
    print "\nStarting OSCServer. Use ctrl-C to quit."
    st = threading.Thread(target=s.serve_forever)
    st.start()

    print "%s connecting to master." % hostname
    while True:
        ##########
        # keep sending ready signals until the master sends a message
        # on the /listen address, which gets us out of this loop
        while not masterStatus['awake'][0]:
            sleep(.01)
            send_ready(c)
        ##########
        # once the master has taken control, we do nothing and let
        # the master drive playback through the handlers
        sleep(.1)
Framing
The last few details consisted of mounting the tactile switches connected to the GPIO pins for shutting down or rebooting the machines, soldering on a mini-jack connected to the master Raspberry Pi to provide audio output, and lastly, installing a main power switch for the whole unit.
Hanging
Here it is… We're very happy! The work looks great. After seeing Natasha's piece running on it in perfect sync, we're all very inspired to work with this triptych format. There are so many ideas to explore. Since we're hoping this will sell at the auction for the Venice Family Clinic, we're already planning to build another one for us to play with.
Parts list
In case you are foolish enough to go down the same route I did, here’s what I used:
3 Raspberry Pi 2s
3 LTN156AT02-D02 Samsung 15.6″ WXGA replacement screens (ebay)
3 HDMI+DVI+VGA+Audio Controller Board Driver Kit for LTN156AT02 (ebay)
1 D-Link – 5-Port 10/100 Mbps Fast Ethernet Switch
DC 12V 10A Regulated Transformer Power Supply (typically used for LED strips)
DC-DC 12V/ 24V to 5V 10A Converter Step Down Regulator Module
Aluminum, micro USB cables, 3 Ethernet cables, nuts and bolts, tactile switches, a power switch, wire…
I’m a pot dealer in Venice Beach
These pots, or ones that look like them, are available for sale, ranging from $30 to $100. I hand-polish them myself to a nice smooth finish with a cement grinder, which reveals the pattern of the aggregate; for the amount of labor I put into them, I really can't afford to sell them, but hey: my house can only accommodate so many, so I have to get rid of them somehow. I don't have a store set up, but leave a message or email me and I'm sure we can figure something out.
From broken laptop to cool utility monitor
Broken things are solutions looking for problems. They hold undiscovered magic potential. I hate throwing away broken things because I'm never sure I won't later re-purpose some part of them into something else. On a seemingly unrelated note, I often need to plug some random computer or video device into a screen for testing, and until now I have had to swap cables on my desktop monitors or pull some bulky, crappy old monitor out of the garage and find a temporary place for it while I use it. What I really want is a small, lightweight monitor I can easily access and occasionally plug a Raspberry Pi or my Linux server into; something closer in size to a laptop screen than a big bulky desktop monitor, and that won't take up desk space.
Cue in broken hardware
As luck would have it, my stupid cat likes to sleep on laptops. They provide a nice warm spot and a soothing fan sound. Usually, the only repercussion is that I wake up in the morning to cat hair on the keyboard and Google helpfully trying to make sense of whatever the cat typed.
In one particular instance however, I was using a hackintoshed Asus 1201N that was clearly not designed to support the weight of a house cat; I know because it never woke up from its night of feline suffocation. I took it apart to see what I could find but never did find the needle in that haystack; probably a shorted motherboard… In the closet of broken stuff it went.
Displays
I'm always thinking about creating different displays. I'm pretty bored with the standard monitor design and always interested in challenges to the rectangle-in-a-sleek-shiny-enclosure status quo, which is how I learned that I could probably find hardware compatible with my broken laptop's LCD panel. I took the laptop apart, got the part number of the LCD panel, and eventually found the right board on ebay for about $30 (I searched for "HSD121PHW1 controller", HSD121PHW1 being the panel's model number). Two weeks later, the padded envelope with the telltale Chinese characters showed up in my mailbox and I immediately plugged everything in to test it. Lo and behold, it worked!
From parts to a thing
Okay, so now I actually have the parts I need to make that small monitor I've been wanting: a small LCD panel and a driver board I can plug any HDMI or DVI source into. The next step was putting it together in a cool, cheap, and convenient way. I had been peripherally interested in Plexiglas for another project, so I decided to try it as a mounting surface. I measured the width of the screen and driver board, as well as the height of the screen with and without the adjustment buttons, and got a couple of pieces cut at my local plastic store (Santa Monica Plastics). I drilled some holes, mounted the various parts on the back using standoffs, and sandwiched them by screwing the smaller piece of plexi onto the front using longer standoffs.
Binary Clock, Part 1
Binary clocks are a family of ubiquitous geek toys which display each digit of the time using binary notation. If you do a search for "binary clock" in Google, you will see a nearly infinite number of implementations. The reason I like the whole genre is that a binary clock is an electronic system that claims to exist for the purpose of conveying information when in fact it's all about finding an obtuse excuse to make something blink. The delight of it is that it's completely impractical to read but the geeks don't care: the coolness of the blinky lights joyfully trumps any need to be practical.
Speaking of geeks, I've been wanting to look into this whole Angular.js framework thingy all those overly bearded tech-hipsters are talking about (when they're not crafting their own cheese or riding fixed-gear bikes), so I'm using this blog post as an excuse to program simple Angular apps that illustrate what I'm talking about.
Displaying numbers in binary
You can see below how to calculate the value of a binary number. It's pretty straightforward: a particular place can only be 0 or 1, and once you increment above that, it loops back to 0 and adds one to the digit to the left; if THAT one is already 1, it also loops back to 0 and the behavior ripples leftward.
[Interactive demo: a 4-bit counter showing each binary digit over its place value (8, 4, 2, 1) and the total they add up to; e.g. 1011 is 1×8 + 0×4 + 1×2 + 1×1 = 11.]
Notice that we can represent a maximum of 16 values with 4 b(inary dig)its. 8 bits can represent 256 values; 10 bits, 1024; and so on...
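If you'd rather see the place-value logic as code, here's a quick Python sketch (my own illustration, not part of the clock's code):

# each bit contributes its place value (8, 4, 2, 1) to the total
def to_bits(n, places=4):
    # most significant bit first
    return [(n >> i) & 1 for i in reversed(range(places))]

print(to_bits(11))   # [1, 0, 1, 1] -> 8 + 0 + 2 + 1 = 11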
Displaying the time in binary
[Interactive demo: the current time, hours/minutes/seconds, displayed as columns of binary digits.]
From math to art
Not comfortable with numbers? Replace the 1's with teal and the 0's with burnt sienna, on a background of deep emerald. Bam! Suddenly, you've become an artist, conjuring a playful visual dance of colors on an abstract rhythmic canvas. You're a fucking genius!
[Interactive demo: the same binary clock rendered as colored squares instead of digits.]
(Note that since the highest number for the hours is 23, the first column never has to go above 2, so we really only need the bottom two places to represent it. Similarly, minutes and seconds only go up to 59, so the third and fifth columns only need to represent values up to 5, which can be done with only 3 bits.)
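And here's a little Python sketch of that six-column layout (again just an illustration; the clock itself is an Angular app):

# one 4-bit column per digit of HHMMSS
def clock_columns(h, m, s):
    digits = "%02d%02d%02d" % (h, m, s)
    return [format(int(d), "04b") for d in digits]

print(clock_columns(23, 59, 41))
# ['0010', '0011', '0101', '1001', '0100', '0001']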
What's next?
Stay tuned... Part 2 describes how to build your very own binary clock, using LEDs, chips, electricity, obsession, and patience.
Dexter’s Head
Quick little test of photogrammetry software. Nothing revolutionary here… It’s simply something that I’ve been interested in for a long time and haven’t gotten around to working with. Until now, that is, since I have been capturing and modeling real environments for live projections.
From this:
to this:
Click and drag in the window to tumble the geometry.
Self Illuminating Responsive Tintype Frame
The areas of a tintype image that get exposed to the most light essentially become a pure silver coating. While the reflective properties of silver make the final image a bit darker than traditional photographic prints, they also give it a really compelling silvery, metallic feel. As a way to highlight those unique qualities, I decided to experiment with hiding LEDs in a box frame to illuminate the image from inside. I also wanted to find a way to intensify the lighting and make the image come alive as viewers got close to it.
Framed!
I inherited some antique oak flooring pieces and made quick work of them on the old chop saw.
Controls
The next step was to figure out what kind of sensor would make the lights come on as a viewer approached the frame. I figured it would need a range of at least a few meters and would have to return the specific distance to the closest obstructing object. I am not a distance sensor expert, so I used the internet for help. In the end, I settled on a Maxsonar EZ0 LV: it's cheap, it has a range of a few meters, and it offers serial, analog, and PWM outputs. I hooked it up to a Teensy board and confirmed the sensor was returning appropriate distance values. On the lighting front, I was planning to control the brightness of the LED string from the Teensy, but since the LED string requires a 12V supply and the Teensy outputs only 5V, I used a MOSFET driven by the Teensy's output to modulate the LEDs' power supply.
Code
God, I love the internet!!! I read somewhere that the sensor's PWM signal was much cleaner, so I started with that, but I eventually found out that it is also much, much slower and couldn't keep up with the rate of change I was looking for. In the end, I used the analog signal and filtered it as best I could.
/*
 Using the analog signal from a Maxsonar LV EZ0 to drive a 12V LED
 strip through a MOSFET. Take 20 samples, sort them, and use the
 median value.
*/
const int anPin = 9;       // analog input channel for the sensor
const int mosfetPin = 9;   // PWM pin driving the MOSFET gate
const int samples = 20;
int anVolt, cm;
int ranges[samples];
float time;
float previousIntensity;
float maxDist, minDist;
int minVal, maxVal;

void setup() {
  // declare the MOSFET pin to be an output:
  pinMode(mosfetPin, OUTPUT);
  Serial.begin(9600);
  time = 0;
  maxDist = 450;   // distances (cm) mapped to min/max brightness
  minDist = 10;
  minVal = 5;      // PWM range sent to the LED strip
  maxVal = 255;
  previousIntensity = 0;
}

void loop() {
  // print time spent since last loop
  Serial.println(millis() - time);
  time = millis();

  // sample the sensor
  for (int i = 0; i < samples; i++) {
    anVolt = analogRead(anPin) / 2;
    ranges[i] = anVolt;
    delayMicroseconds(50);
  }
  isort(ranges, samples);

  // take the median sample and convert to centimeters
  cm = 2.54 * ranges[samples / 2];

  // shape the curve of the result
  float intensity = (cm - minDist) / (maxDist - minDist);
  intensity = 1 - constrain(intensity, 0, 1);
  intensity = intensity * intensity;
  // heavy smoothing to avoid visible flicker
  intensity = previousIntensity * .995 + intensity * .005;
  previousIntensity = intensity;

  // set the brightness of the MOSFET pin
  float val = intensity * (maxVal - minVal) + minVal;
  analogWrite(mosfetPin, val);
}

// insertion sort, used to find the median sample
void isort(int *a, int n) {
  for (int i = 1; i < n; ++i) {
    int j = a[i];
    int k;
    for (k = i - 1; (k >= 0) && (j < a[k]); k--) {
      a[k + 1] = a[k];
    }
    a[k + 1] = j;
  }
}
Next Steps
Once I get my paws on that Arduino Nano board, I can rework my circuit and get the soldering iron out to make my electronics more permanent. I have also ordered an HC-SR04 distance sensor to see how it compares to the Maxsonar in terms of accuracy, speed, and noise. Also, I need to make the frame a little deeper so the light can spread a bit further across the image.
Which one is cooler?
The ominous cyclopean look or the cute and friendly Wall-E look?
Braised Tangerine Tintype Beef Tongue
Prologue
In which we explore the process of exploration, and we take things and make other things from them for the purpose of being joyful in the process of exploration, and mix things up in a way that creates little mental explosions of aliveness, for no other purpose than to celebrate the uncircumscribeable, with a nod to Eraserhead and Quinn Martin. Oh, and also, pictures and tacos come out in the end.
Act I
Ingredients:
-One beef tongue
-Two onions
-Chard
-One cake plate
-One 8×10 camera
-One 8×10 black enameled aluminum plate
-One studio with some lights
-Chemistry: collodion, silver nitrate, developer and fixer
-Two heads of garlic
-Three chipotle chilies
-Half a teaspoon oregano
-Eight bay leaves
-Five tangerines
-Salt and pepper to taste
Act II
-Take the beef tongue, the camera and the chemistry into the studio, and lay out the beef tongue onto the cake plate. Decorate the base with some chard leaves and carefully place your lights around the arrangement to highlight the features of the tongue.
-Take some pictures of it and adjust as needed.
-While you’re at it and in the studio, take more pictures of other stuff you happened to bring along.
-After a long day shooting, bring the tongue home, place it in the fridge and forget about it.
-Leave in fridge for four or five days until your wife starts asking what the hell you’re intending to do with that disgusting thing.
-Decide maybe you should make some decisions regarding what the next step is for the tongue.
-Look at a bunch of recipes online, decide to not follow any of them and wing it instead.
Act III
-Put the tongue in a large pot and cover with six quarts of water. Add one of the onions, one of the heads of garlic, and two of the chilies, all chopped. Add the bay leaves, salt, and pepper, and boil for 4 hours.
-Let the tongue cool and peel off the skin. Cut it in half, slice off a piece, and decide it's kind of weird and probably not quite cooked enough. Pretend you're cool and that all the strange gristly fatty bits don't freak you out at all.
-Decide it can be better…
This blog post is starting to get a little text-heavy, and we know that can be a problem for today's typical 4-minute-attention-span reader. Here's something completely random and unrelated that moves, for the purpose of keeping overactive brains engaged. It's even interactive, so when you click in it, it does interactive multimedia!
Act IV
-Cut up the tongue in about 10 chunks and saute the pieces in olive oil in a dutch oven. Add salt, oregano, one chopped onion, peeled garlic cloves from the second head, and finely chopped chili.
-Juice the tangerines, remove the seeds, and add the juice and the peels to the mix.
-Cover the dutch oven and place in a 325°F oven for 4 hours, checking occasionally to make sure it's not getting too dry.
-Once it’s done, cut it into 1/2 inch slices and sprinkle with chopped white onion and cilantro, and squeeze lime juice on it.
-Serve it to your family and watch them give it the old college try, claim that they really like it but say they’re not really hungry.
-Eat a lot of it yourself because it’s pretty freakin’ good, actually.
-Eat tacos de lengua at lunch for the next 5 days.
-Have your family eventually admit to you they thought it was weird and disgusting, and please never make it again.
-Blog about it later because if you don’t blog about, how can you know for sure it actually happened?
Epilogue
In which one had better figure out what the hell to do with the cow's feet that were purchased with the tongue, because they're really starting to smell. I hear tendon soup is a thing… Thanks to Tintypebooth for the help with the tintypes.
Building an Ultra Large Format Camera, Part 1
The basic elements of a camera
One of the cool things about a camera is that, at its core, it's very simple. All you need is a lens that focuses light and a surface for that light to be focused on. The process of bending light with lenses to focus it on a surface was first explored during the Renaissance with the camera obscura, but it wasn't until the 19th century that people figured out how to keep a record of how much light hit a particular area of that surface. So, to create an image with a camera, you either need to trace the image you see projected on the surface, or you need some kind of coating on the surface that reacts to light.
Is that a large lens in your pocket?
After giving me a taste of 4×5 tintypes, my buddy at Tintypebooth showed me some large old lenses from photographic systems used in spy planes that he had bought on ebay. These things are serious! They are very heavy and the glass is super thick; there is just something massive about them. When you hold one and feel its weight, you can't help but be awed by its image-making potential, and you get possessed by an urge to unlock it. He pitched the idea of building an ultra large format camera with one of them, a little Kodak Aero-Ektar 24″ 610mm number, weighing in at just over 10 lbs and sporting a few scratches I like to think were caused by the strafing of some of the Luftwaffe's last Messerschmitts.
Let's decode those numbers, shall we? The 24″ is the size of the image plane and 610mm (also 24″) is the focal length. Based on my previous post about lenses, this means that at its shortest, this camera will be a little over two feet long. With four feet between the lens and the image plane, the image will be the same size as a subject in focus four feet from the lens, and six feet will create an image bigger than reality. The film holder will need to accommodate plates that are 24 inches on one side. I may need a bigger car…
I need a plan
Patience is a virtue I’ve always been in somewhat limited supply of. We have this killer lens… What’s the fastest and cheapest way we can get a picture out of it? Sure, we can design a fancy camera with a lot of bells and whistles but it would take a long time and cost a pretty penny. For now, I just need a bare bones proof of concept prototype. I’ll focus on the basic pieces and see if I can build it myself. I’ll build the back out of oak and do all the struts and supports using aluminum channels. The animated image above is a Maya model I built to scale that shows how all the pieces need to fit together. It doesn’t look too difficult, does it? One thing not shown in the animation is that the back that will hold the plate will be interchangeable with another back that will have the ground glass necessary to focus. The process will be as follows: first you will use the ground glass back to focus, slide it out, and then slide in the film back to load your camera.
Baby got back
Kim Kardashian's got nothing on this bad boy! I built this 24″x20″ film back over the past couple of weeks. I'm not a great builder and my Home Depot tools are a bit wobbly, so I wouldn't call it fine craftsmanship, but it will hopefully do the trick. Oh, and did I forget to mention it's not exactly square? Yeah… Let's just say it's square enough. It's made from 1″x2″ and 1/4″x2″ red oak lumber, which I routed to get the insets. It will make a great example when we eventually hire a finish carpenter for the next fancy version of the camera. Here are some pictures of the various pieces it's made of (you can also see that I like to wear my slippers when I take pictures of my handiwork).
More to come…
Here are the steps that come next and will be documented in a hopefully not too distant future.
- I already bought the aluminum extrusions that are necessary to build the film back support, the lens plate holder, and the rails. I will need to learn how to properly drill in aluminum and figure out how to connect all the pieces. (anyone in Venice with a drill press?)
- I will build the lens plate, mount the lens on the plate, and mount the plate on the rails.
- Last will be creating the bellows. Not too sure how that will work but what the hell! We’ve got a few ideas. I’m sure we’ll figure something out.
See? It’s basically like it’s done already…
The three main attributes of a camera lens
If you ever want to embark on the foolish pursuit of building a camera, you will need to understand how lenses work. The three attributes that control the behavior of a basic camera lens are focal length, format, and aperture.
Focal length
The focal length is the distance from the lens at which rays coming from infinity converge. For example, if you have a 50mm lens, infinity is in focus when the image plane is 50 millimeters from the lens. A lens's focal length reflects how much it bends the light: the more it bends, the closer to the lens the rays converge and the smaller the projected image. This, in turn, means the focal length can also be used as an indication of the magnification of the image.
Format
The format is the intended size of the image plane the lens is designed to project onto, such as 35mm or 4×5. The combination of focal length and format determines the lens's field of view. This is why a lens with the same focal length gives you a different amount of magnification on different formats. (illustration)
Aperture
The aperture gives a measure, in f-stops, of how "fast" the lens is. It is often thought of as the size of the opening in the lens, but in photography it is actually the ratio of the focal length over the diameter of the lens opening; a 50mm lens with a 25mm opening, for example, is at f/2. This has the advantage of remaining proportionally equal across different sizes of photographic systems: if you use the same f-stop on a tiny phone camera and a big SLR at the same ISO setting, the same shutter speed will expose both images similarly. As the size of the opening increases, more light gets in, but the thinner the plane of focus becomes.
Finding the focal plane
The last relevant piece of information is the formula that determines the distance of the focal plane to the lens for objects closer to the camera than infinity. The relationship between the distance of an object to the lens (S1) and the distance of the lens to the focal plane for that object (S2) is defined by this formula:
1/S1 + 1/S2 = 1/focal_length
Solving for S1, this becomes:
S1 = (S2 * focal_length)/(S2 - focal_length)
If you plug in your own numbers, you will notice that the closer your object is to the lens, the farther away the focal plane will be from the lens. For example, with a 50mm lens, a subject 1000mm away comes into focus about 52.6mm behind the lens, while a subject 200mm away focuses about 66.7mm behind it.
Useful python functions
import math

# Given a focal length and a distance to the subject, how far from the
# lens does the image plane converge?
def distanceOfFocusPlaneToLens(distanceToSubject, focalLength):
    v = (focalLength*distanceToSubject)/(distanceToSubject-focalLength)
    print ("when subject is at %s the focal plane is %s from the lens" % (distanceToSubject, v))
    return v

# FOV calculator (note: the field of view depends on the focal length
# and the format size, not the aperture)
def FOVFromFocalAndFormat(focal, formatSize):
    return math.degrees(2*math.atan(formatSize/(2.0*focal)))
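As a sanity check, here's what these functions give for the Aero-Ektar from the camera build above (610mm focal length on a 24″, roughly 610mm, plate):

distanceOfFocusPlaneToLens(1220, 610)    # subject 4 ft (1220mm) away -> focal plane 1220mm behind the lens, i.e. 1:1
print (FOVFromFocalAndFormat(610, 610))  # ~53.1 degrees across the plate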