A couple of months ago, Milly started having beam issues. At the time it seemed like emitter trouble. New emitter modules run about $3k, so I took the opportunity to look into manufacturing my own.
But that’s a long story for another time.
The short version is that I’ve learned a lot in the last couple of months:
- Cold cathode tungsten emitter tips are really, really tiny. I knew that, of course, but you don’t truly have an appreciation for something until you try to make one.
- Spot welding tungsten is harder than you might think. It has the highest melting point of any element (3422 °C) and gets quite brittle after heating.
- Before jumping right into emitter maintenance, be sure to check all of your fuses.
In the end, it turned out that I had blown two fuses in the electromagnetic lensing power supply. This was the cause of the beam trouble, not the emitter itself.
Why two fuses? This circuit uses two 10A fuses in parallel, with each half supposed to carry about 7A.
Why two fuses in parallel instead of a single 20A fuse? I have no idea; the original manufacturer thought it was a great idea. The trouble is that when one fuse of a parallel pair blows, the survivor suddenly has to carry the full 14A on a 10A rating, so it blows too, often in spectacular fashion.
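The cascade is easy to see with a bit of arithmetic. Here’s a sketch (the even split of the 14A load between the two fuses is an idealization):

```python
# Two 10A fuses sharing the lensing supply's 14A load (7A each).
FUSE_RATING = 10.0  # amps per fuse
TOTAL_LOAD = 14.0   # amps drawn through the pair

def surviving_fuse_current(blown):
    """Current through each remaining fuse after `blown` of the two
    fuses have failed, assuming the load splits evenly."""
    survivors = 2 - blown
    return TOTAL_LOAD / survivors

print(surviving_fuse_current(0))  # 7.0 -- each fuse is fine
print(surviving_fuse_current(1))  # 14.0 -- well past the 10A rating
```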
After changing the fuses, I decided to put the original emitter back in place. One 24-hour bake later, she’s back online.
She’s not quite 100% yet… There’s a little trouble with the noise cancelling pre-amp, and I need to take the time to properly realign the column. But thankfully she’s up and making images again.
More on my DIY emitter adventure in a future post.
I recently picked up an AmScope SE400Z inspection scope. It’s a handy desktop microscope that sells for under $200. The large area under the objective lens leaves plenty of room to work, and the 10-20x magnification is plenty for most of my needs.
While AmScope does sell some nice-looking USB eyepiece cameras, the price is a little high considering that they need a laptop to function.
I happened to have a Raspberry Pi camera lying around, and thought that it might be handy to turn the AmScope into a digital scope.
The Raspberry Pi has a plastic case that includes a flush mount for the camera, and the case is attached directly to the eyepiece with some high-tech laser-cut plywood and Gorilla Tape.
I use it with a cheap WiFi dongle so it has connectivity wherever I happen to need it in the shop. It runs raspistill in full-screen mode, with HDMI feeding an old monitor. The USB keyboard makes taking a photo as easy as hitting Enter, though I’m considering making a simple button or foot switch for that. Photos are automatically synced to the network with BitTorrent Sync as they’re taken.
With the 5 megapixels provided by the camera, you end up with an effective zoom of about 50x.
I recently posted my first batch of photos from Milly. While I am happy with her beam performance, I was dissatisfied with the digital photo quality.
The inset NTSC image was taken with a USB frame grabber on the CRT port. The bigger image was taken not with a $1k data acquisition module, but with an audio cable, a resistor, and a sound card.
Analog to digital
Milly is a JEOL JSM-6320F, an instrument from another era. That F is important: it means she uses a field emission gun rather than a thermionic filament (like Meryl). This gives you significantly more control over the beam current and, ultimately, brighter pictures at higher magnification.
But Milly is predominantly an analog device. While she sports “digital storage”, the on-board memory can only hold four frames, which are lost when the scope is powered off. There is a SCSI option for a 30MB hard drive, but I haven’t had any luck getting it to recognize a drive. According to one forum post I found from 1993, the files would be in an “obscure and difficult” format even if I could read them.
So to get digital photos from her, I could either take crappy pictures of the screen or put a cheap NTSC frame grabber on her CRT mirror port (tip of the hat to Glen MacDonald from that same post for pointing out which port to use!). This makes taking photos really easy, but it limits the resolution to NTSC (about 500 lines).
At first I took the second route, and ended up with a bunch of pretty (but tiny) screen captures. There had to be a better way.
Her intended output device is a Polaroid camera attached to a dedicated CRT. You put in a sheet of film, set the scope to do a time exposure, and the CRT exposes the film one line at a time. The Polaroid adapter is the little black box on the right of the main console:
Even if I could find the proper film, I would end up with a useless hard copy. I would then need to scan it right back in so I can share it online. That way lies madness.
While the NTSC frame grabber can’t cope with the signal on the photo CRT, I could always sample it with a “scientific” data acquisition device. These modules are designed to minimize latency and artifacts, to produce the most accurate possible representation. This is critical for manufacturing and scientific applications, where a difference of a few nanometers can make or break a project. But if I just want nicer photos, the cost of these beasts ($1k and up) is out of the question.
Slow down there, pixel clock
According to the manual, the film is scanned at up to 1940 lines of resolution, in a programmable period of up to 320 seconds. What would it take to sample that directly, assuming a 4:3 aspect ratio? Let’s do some pixel math:
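In code form (the 4:3 aspect ratio as assumed above; the standard 525-line frame for the NTSC comparison is also my assumption):

```python
# Back-of-the-envelope pixel clock: lines per frame times pixels per
# line (4:3 aspect), divided by the time it takes to draw one frame.
def pixel_clock_hz(lines, seconds_per_frame):
    pixels_per_line = round(lines * 4 / 3)
    return lines * pixels_per_line / seconds_per_frame

photo = pixel_clock_hz(1940, 320)   # slow photo scan
ntsc = pixel_clock_hz(525, 1 / 30)  # NTSC frame at 30 Hz
print(f"photo CRT: {photo / 1e3:.1f} kHz, NTSC: {ntsc / 1e6:.1f} MHz")
# photo CRT: 15.7 kHz, NTSC: 11.0 MHz
```

Roughly 16 kHz is comfortably within a sound card’s 48 kHz sample rate; 11 MHz is nowhere close.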
The slow photo scan works out to a pixel rate of only about 16 kHz, well within reach of a sound card sampling at 48 kHz. The NTSC signal is another story: even though it’s a much smaller frame, the 30 Hz refresh rate pushes the pixel clock up over 11 MHz. No way a sound card can keep up with that, which is why faster ADCs exist for video sampling.
Audio to Video
So I dug into the old box of audio cables and found a 3.5mm to RCA cable. I had a bunch left over from my bullet time camera rig project (one came free with each camera).
I connected the photo signal to the left channel and the horizontal sync signal to the right. I also added a couple of high-value resistors to limit the current and hopefully avoid damaging the scope. Then I hit the PHOTO button, and made a WAV file recording at 48 kHz.
I ended up with a 50MB WAV file full of data.
I had to turn the gain way down to avoid clipping. Adding a potentiometer to attenuate the input signal would probably be a good idea.
The next step was to turn it back into a picture. I used numpy, audiolab, tifffile, and about 4 lines of python.
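The heart of it looks something like this (a sketch rather than the exact script; the function name, the simple threshold sync detection, and the channel handling are my own):

```python
import numpy as np

def wav2img(photo, sync, threshold=0.5):
    """Slice the photo channel into scan lines at each rising edge
    of the horizontal sync channel, then stack them into an image."""
    high = sync > threshold
    # Indices where sync crosses the threshold going up: line starts.
    edges = np.flatnonzero(high[1:] & ~high[:-1]) + 1
    lines = [photo[a:b] for a, b in zip(edges[:-1], edges[1:])]
    width = min(len(line) for line in lines)  # trim sync jitter
    return np.stack([line[:width] for line in lines])
```

Read the recording with audiolab (or scipy.io.wavfile), pass the left channel as photo and the right as sync, then hand the resulting array to tifffile to write out the TIFF.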
Here is the shot as captured with the NTSC frame grabber:
And here is my first attempt at wav2tiff:
There are so many problems! The aspect ratio is wrong. The sync wanders all over the place. I’m missing half of the contrast depth. And what is all of that extra junk on the right?
Fortunately, these are all software problems. I added a few more lines of python, scaled and cropped it appropriately in Photoshop, and ended up with this:
Much cleaner! That’s a 3.4 megapixel image, scaled to fit on this web page. Click it to zoom all the way in.
I believe the black streaking effect is due to poor brightness and contrast settings. Since this is a time exposure, the beam deposits a lot more charge on the sample, making the image brighter and parts of it overcharged. While the settings were fine for the fast NTSC scan, they’re too bright for a 320-second exposure.
You can see a similar effect on my earlier shots of pollen taken with the NTSC grabber. I think I simply need to turn the brightness down.