Purposing a Makesmith CNC

Back in May of 2014 I was a Kickstarter supporter of the Makesmith CNC, which was an attempt to build an extremely inexpensive CNC router. The project was 822% funded, and they shipped all the kits in November, almost on time. Yay!

I began to put mine together right away. They had pretty good video tutorials on how to do it, but not much in the way of written documentation, and there were some holes in the tutorials. I made a post or two on their forum about my experiences. There were a few minor updates to the videos, made as overlaid titles, but nothing very substantial. I set the project aside to wait for the rest of the community to catch up and participate in improving the design.

Over a year later, in December 2015, I picked it up again. I found no improved documentation and relatively little more in the forums, so I just completed the assembly of the mill using my own best guesses. After jumping through some hoops to get their Macintosh software running, I was able to run some initial tests. I wasn’t impressed. As expected, it was really slow, but it’s hard to appreciate how slow without seeing it in person. Also as expected, its axes were pretty wimpy, barely powerful enough to overcome their own friction. One of the cost-saving measures that makes the design feasible is that they don’t really try to prevent lost motion; instead, they measure it with a closed-loop feedback system and try to compensate. All this is more or less as advertised.

My particular mill had worse problems. It had a tendency to get stuck in the X and Z axes. This is undoubtedly due to alignment problems with the rails, which arise from some combination of the low-budget design and the assembly procedures I used. Because of the low-budget construction, though, there is no easy way to adjust the alignment after assembly and gluing. I’m sure there’s some way to make the mill work to expectations, but at this point it became clear to me that this mill was going to be frustrating to use. The design had simply been compromised too much in the name of reducing cost.

By that time it was clear that Makesmith was never going into production with these machines, either. They filled the Kickstarter orders, and stopped. The software hasn’t been updated, and the user forum looks abandoned (overrun by forum spam). Maybe Makesmith reached the same conclusion: that the Makesmith CNC just wasn’t viable. They ran a new Kickstarter project in November 2016 to make a much larger CNC router at a very low price point. I hope they and their Kickstarter backers have better luck this time.

Anyhow, I decided to give up on using the Makesmith CNC. It was fun to build and educational, so I got my money’s worth (more or less), even though I never even mounted a spindle (i.e., a Dremel tool) on the mill. I began researching commercially available mills and a few months later ended up buying the Tormach PCNC 440 with all the goodies for about 55 times the cost of the Kickstarter Makesmith CNC. Needless to say, it’s in a whole different class.

All of which is just a long explanation for why I have the carcass of a Makesmith CNC sitting around, capable (barely) of X-Y-Z positioning, but without a purpose. I also had a cheap USB microscope, which was adequately functional but which came with a nearly useless articulated-arm stand. Recently it occurred to me that the Makesmith would work as a positioner for the microscope. With a solid stand and relatively precise orthogonal positioning axes, the microscope would be transformed from a toy into a tool. Holding a lightweight plastic microscope and positioning it interactively under manual control is far less demanding for the Makesmith chassis than working as a milling machine.

I wanted interactive control of the positioner, independent of any computer software, so I bought a joystick on a breakout board and wired it up to spare I/O pins on the Arduino Mega 2560 that serves as the brains for the Makesmith CNC.

[Image: the joystick on its breakout board]

I then discarded the Makesmith firmware and wrote a dead-simple Arduino program that reads the joysticks and moves the axes, without any of the complexities of feedback or G-code interpretation or much of anything else. The software is on GitHub. I also added a power cable so that the Arduino was powered from the Makesmith’s power supply, instead of from the host computer’s USB port. Here’s the final lashup:

[Image: the finished MicroSmith lashup]
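To give a sense of how simple “dead-simple” is, here’s a stripped-down sketch in the same spirit. The pin numbers and motor-driver wiring below are placeholders, not the actual Makesmith assignments or the code in my repository; the shape of the thing is just read the joystick, apply a deadband, and drive each axis at a speed proportional to stick deflection.

// Minimal joystick jog control -- illustrative sketch only.
// Pin numbers and driver wiring are placeholders, not the real
// Makesmith assignments.

const int JOY_PIN[3] = {A0, A1, A2};   // joystick X, Y, Z (analog inputs)
const int PWM_PIN[3] = {4, 5, 6};      // speed (PWM) pin per axis, hypothetical
const int DIR_PIN[3] = {22, 23, 24};   // direction pin per axis, hypothetical

const int CENTER   = 512;              // nominal center of a 10-bit analog read
const int DEADBAND = 40;               // ignore small offsets so the axes hold still

void setup() {
  for (int i = 0; i < 3; i++) {
    pinMode(PWM_PIN[i], OUTPUT);
    pinMode(DIR_PIN[i], OUTPUT);
  }
}

void driveAxis(int axis, int reading) {
  int offset = reading - CENTER;
  if (abs(offset) < DEADBAND) {
    analogWrite(PWM_PIN[axis], 0);     // stick centered: stop this axis
    return;
  }
  digitalWrite(DIR_PIN[axis], offset > 0 ? HIGH : LOW);
  int speed = map(abs(offset), DEADBAND, CENTER, 0, 255);
  analogWrite(PWM_PIN[axis], constrain(speed, 0, 255));
}

void loop() {
  for (int i = 0; i < 3; i++) {
    driveAxis(i, analogRead(JOY_PIN[i]));
  }
  delay(10);                           // crude rate limit; no feedback, no G-code
}

Push the stick and the axis moves; let go and it stops. That’s the whole interface.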

The microscope is a cheapie, but it gives decent results. My test subject here is a Raspberry Pi board. Here’s the view at minimum magnification, at 640×480 pixel resolution:

[Image: view at minimum magnification]

For scale, the big black chip is 9mm square. Here’s the view at maximum magnification and maximum resolution:

[Image: view at maximum magnification]

We see just a few of the pins on one side of the same chip. Near the middle of the image is a via, a tiny hole that routes the circuit to the other side of the board. Those little white specks inside the via are formed by a silk screened annotation on the back side that happens to overlap with the hole. At this magnification, the depth of field is pretty shallow, so the top of the chip and of C78 are quite out of focus.

The motion of the Z axis is quite sufficient for focusing the microscope, even at maximum magnification. It could use more vertical travel for working on larger subjects, but that’s true of every positioner in the world. The X and Y axes are still slow, but even at minimum magnification they move the image about as fast as you’d want them to. The speed only becomes an issue when you need to move the microscope to a whole different area. Often it’s easier to just move the subject around on the platform.

The microscope can capture video, too, but the motion isn’t really smooth enough for that to be impressive. I’m using miXscope software to control the microscope from a MacBook Air. That software has a number of useful features for technical microscopy, but it’s a bit long in the tooth and a bit crashy on current versions of macOS.

The next obvious application for a microscope with a positioner is to automatically photograph a grid of overlapping images at different X-Y offsets, which can then be stitched together to create a larger high-resolution image. To do this I’d need to add back some of the complexity of the control firmware, so it’s on the back burner for now. Another variant would be to take multiple exposures at different Z offsets, which can then be merged to increase the effective depth of field. More projects!
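The shape of that scan is simple enough to sketch even without the feedback code. Here’s a purely hypothetical, open-loop skeleton, reusing the placeholder pin arrays from the jog sketch above, with the host grabbing a frame during each pause; doing it repeatably is exactly where the control-firmware complexity would come back in.

// Hypothetical grid-scan skeleton, reusing PWM_PIN/DIR_PIN from the jog
// sketch above. Timed open-loop moves, so repeatability is poor; the host
// software would trigger a microscope capture during each pause.

const int COLS      = 5;      // capture positions across (placeholder)
const int ROWS      = 4;      // capture positions down (placeholder)
const int MOVE_MS   = 800;    // timed move between adjacent cells (placeholder)
const int SETTLE_MS = 500;    // let vibration die down before each capture

void jogAxis(int axis, int dir, unsigned long ms) {
  digitalWrite(DIR_PIN[axis], dir > 0 ? HIGH : LOW);
  analogWrite(PWM_PIN[axis], 200);     // fixed jog speed (placeholder)
  delay(ms);
  analogWrite(PWM_PIN[axis], 0);
}

void scanGrid() {
  for (int row = 0; row < ROWS; row++) {
    for (int col = 0; col < COLS; col++) {
      delay(SETTLE_MS);                // host captures a frame here
      if (col < COLS - 1) {
        // Serpentine path: alternate X direction on each row.
        jogAxis(0, (row % 2 == 0) ? +1 : -1, MOVE_MS);
      }
    }
    if (row < ROWS - 1) {
      jogAxis(1, +1, MOVE_MS);         // step down to the next row
    }
  }
}

Focus stacking would be the same loop applied to the Z axis instead.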

This was a quickie weekend boondoggle, except for waiting for the joystick board to arrive. Well worth the effort to add an improved tool to the lab.

Dongles considered harmful

Helping to search for a missing copy-protection dongle (for embroidery software) reminds me just how awful that kind of copy-protection is.

I can understand why software publishers are tempted to use it, especially for niche market titles that command relatively high prices. The arithmetic might even work out in the publisher’s favor for hardcore engineering software used almost exclusively by corporate minions on big-budget projects. I imagine that some of the users of the embroidery software are commercial users doing embroidery for customers, and those users pretty much have to pay whatever the publisher demands. The commercial-grade embroidery machines certainly aren’t cheap; the software doesn’t materially alter the capital budget even when it’s grossly overpriced.

But there are also prosumer embroidery machines aimed at advanced hobbyists. Those machines aren’t exactly cheap either, but the price of the software really does drive the total cost way up. With the software that expensive, and hobbyists using it “just for fun”, some would no doubt use the software without paying, and some of those wouldn’t be technical enough to circumvent the dongle copy protection. The license fees those customers end up paying are the upside of copy protection for the publisher.

The downside, of course, is that all the other customers are treated like thieves. They are inconvenienced every time they run the software, for the sole benefit of the software publisher to whom they have already paid a wad of money. When the dongle goes walkabout, as it inevitably does at the worst possible time, they are prevented from doing any work until the dongle can be located or replaced. It sure doesn’t help the customer feel affection for the software or its publisher. Word of mouth suffers. Bad dongle experiences (not to mention high prices) poison the potential community of users. Great software that could have been a runaway favorite ends up feeling like a necessary evil.

Ugh. So far, we still haven’t found the dongle, so we may get a chance to find out how good the publisher and its local dealer are at customer support.

DLNA is not a total loser

I doomed myself by saying that streaming from my Mac Pro to the Sony Playstation 3 was working extremely well. A few days later, it was completely broken. Here’s the story.

The ingredients: two houses with home theater equipment, a collection of movies on DVD, a variety of computers available, an Apple TV, and a PS3. The discs can’t be in both places at once, so I ripped them to hard disk for use in one house, and stored all the discs at the other house for playback there. I am interested in the extras found on many DVDs, not just the feature, so I was careful to rip all the video selections on each disc. I kept them organized with a folder for each product (one disc or multiple). For TV series collections, I’d include an episode number in each filename so they could be retrieved in chronological order. The extras for each product I’d gather together in a folder named Extras and give them sensible filenames. The idea was to bypass the awkward, silly menu structure imposed by the DVD authoring and just have access to each significant piece of video under a reasonable name.
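Concretely, a product’s folder would end up looking something like this (the titles and filenames here are just illustrative):

Star Trek Enterprise Season 1/
    Episode 01 Broken Bow.m4v
    Episode 02 Fight or Flight.m4v
    ...
    Extras/
        Creating Enterprise.m4v
        Deleted Scenes.m4v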

The resulting collection filled up most of two terabyte external Firewire drives, after re-compression. There was no settop box available with that much storage, at least none at a reasonable price. So, I needed a way to play the video from the external drives. I leave my Mac Pro on all the time, so the obvious solution was to leave the drives mounted on the Mac Pro and stream the video to the living room on demand.

OK, it’s a Mac, and Apple generally does a superb job of getting user interfaces right. So the obvious solution was an Apple TV, streaming from iTunes on the Mac Pro. Total and complete disaster. iTunes was happy to import everything from my complicated directory structure, and to serve it up on demand to the Apple TV. As a flat list. One, single, very long flat list in ASCIIbetical order. That meant that under ‘M’ in the list I had six separate things named “Mission Overview” (yes, I have all the Star Trek series on DVD). Episode 01 of every season of every series sorts together in the list, before any of the Episode 02 files. Just a hideous mess.

Waiting through a couple of major revisions in Apple TV software didn’t help. Take Two, then 3.0, still lame. Now, I’m sure this is the right design for some class of user. If you have a few dozen titles and you keep only the main movie from each disc, maybe this is very nice. But it doesn’t match my requirements.

Going down the Apple TV route, I could either keep waiting for Apple to change its design to suit my needs (which seems increasingly unlikely as Apple TV converges on a design that seems more and more oriented toward the iTunes Store), or hack the Apple TV with third-party software. I know there are options available from third parties, and as a reasonably serious computer guy I’m not too intimidated by the prospect of a complicated conversion process. I could make it work, but I never did. For one thing, I wasn’t looking forward to devoting the time it would take to evaluate the various third-party options. For another, I do like the way the Apple TV handles music, and I didn’t want to screw that up. Nonetheless, I was just about ready to dive in, because I needed a way to have convenient access to all that video.

Somewhere in the middle of this drawn-out process, Blu-ray happened, and I bought a Sony Playstation 3 as a Blu-ray player. I was distracted from the DVD archives for a while by newly-purchased Blu-ray movies, and didn’t pay too much attention to the other capabilities of the PS3.

A few weeks ago I happened to upgrade the software in the PS3, just because there was new software available. After installing the upgrade, I again perused the menus to see if there was anything new and cool. I didn’t find anything new of interest, but I did notice again that the PS3 had a way to look for a video server on the network. Hmm! I wondered if it was any less lame than the Apple TV. I didn’t think it was likely, but it might be worth a try.

A quick research session on the Internet taught me that I needed a DLNA server, and that a well-respected one for the Mac was MediaLink from Nullriver. And hey, I already had it licensed and installed on my machine, from some previous experiment with video streaming. I fired it up and pointed it to the two external drives, and went back downstairs to see what it looked like on the PS3.

There it was! The PS3 had already automagically detected the MediaLink server and tagged it Potato, the host name of my Mac Pro. If I just clicked on that, I’d find out just how awful a job the PS3 software would do with my media. Click. It showed me the names of the two drives. Click on one of those, it showed me the top-level directories on that drive. In fact, the whole directory hierarchy I’d painstakingly laid out — and Apple TV promptly flattened — was there to browse. That’s exactly what I wanted. The browsing was even pretty snappy, over my wired Ethernet.

What’s more, the video playback worked, without much annoying delay or any glitches. Even fast forward was smooth and predictable, so I could skip the horrendous theme music when watching episodes of Enterprise. Life was good. And then I made the mistake of saying so, and the very next time I tried to stream video, it failed utterly to work.

Maybe this was a problem introduced by the new PS3 software, and with a little luck Sony had already fixed it. I checked for another new version. Sure enough, there was another update. I let the PS3 update itself and tested again. Still busted.

Maybe this was a known problem with MediaLink — I was running an old version, after all — and Nullriver had already fixed it. I downloaded the latest version of MediaLink, 2.0b1 (a beta release) and installed that, after figuring out that it installs as a preference pane and not as a regular application. Another test, another complete failure.

Oh, the PS3 could still see the MediaLink server. It still showed up tagged Potato. But when I attempted to browse the server, the PS3 claimed “There are no titles” and in the upper right corner, a message box appeared heralding “DLNA Protocol Error 7531”. Wow, what a user-friendly error message. For a translation, I turned again to the Internet, but I didn’t find much in the way of specifics. A lot of people were having random-seeming problems with DLNA protocol errors, including number 7531, but nobody seemed to have much of a clue what exactly it meant or how to fix it.

Well, no problem, right? I can just look up the DLNA specification and find out what that error code is defined to mean. No, I can’t, because the DLNA protocol specification isn’t public. It costs $5000, and I can only imagine what kind of agreements I’d have to sign before I’d even be permitted to pay. In any case, I doubt there’s a lot of precision in the definition of error codes even in the full spec.

With the spec unavailable, I gleaned what I could from various articles discussing DLNA. One particularly useful post was Why do I hate DLNA protocol so much? by Ben Zores, author of GeeXBoX, an open-source Linux-based media center software distribution. From Ben’s rant I learned that at the bottom of multiple layers of directory service and connection management cruft, all that’s really happening is that the server is providing the client with an HTTP URL from which to stream the media.

Armed with that information, I fired up Wireshark to trace the network packets going between Potato and the PS3. Every 60 seconds, I saw a short TCP transaction, a single query and its response. Here’s the query from the PS3 to Potato:

GET /MediaServer/DeviceDescription.xml HTTP/1.1
Host: 192.168.1.74:9386
Date: Sat, 12 Dec 2009 08:20:37 GMT
User-Agent: UPnP/1.0
X-AV-Client-Info: av=5.0; cn="Sony Computer Entertainment Inc."; mn="PLAYSTATION 3"; mv="1.0";

This makes a lot of sense. The PS3 is identifying itself, and asking for something called /MediaServer/DeviceDescription.xml — a generic-sounding name, so it’s probably straight out of the protocol spec. Notice that Potato’s IP address is 192.168.1.74. You can’t see it here, but the PS3’s IP address is 192.168.1.64, so both are on the same subnet with a netmask of 255.255.255.0.

The response from MediaLink on Potato to the PS3 consisted of a similar header followed by a 55-line XML document. Here’s the header:

HTTP/1.1 200 OK
Content-Type: text/xml; charset="UTF-8"
Content-Length: 2229
Connection: close
Date: Sat, 12 Dec 2009 08:20:36 GMT
Server: Mac OS X/10.x.x, UPnP/1.0, Nullriver HTTP Server/3.0

So far, so good. The Mac OS X server is responding and is going to send a 2229-byte XML response. Here’s the first few lines of the XML:

<?xml version="1.0"?>
<root xmlns="urn:schemas-upnp-org:device-1-0">
	<specVersion>
		<major>1</major>
		<minor>0</minor>
	</specVersion>
	<URLBase>http://192.168.179.1:9386/MediaServer/</URLBase>

Whoa. Look at that IP address given as a hostname in the URLBase tag. It’s 192.168.179.1. That’s not Potato’s IP address, and it’s not even on the same subnet. If MediaLink is telling the PS3 to get its media from that address, it’s no wonder it fails to work!

So, where the heck did that address come from? Googling that particular IP address didn’t reveal anything special. I tried putting that URL from URLBase into my browser, not really expecting to get a response since I knew there was no route to any such subnet on my network. But there was a response, and it was just the kind of terse and somewhat cryptic response you might expect from a server that’s expecting to respond to a specialized client program. I tried trimming off the filename part and submitting just the IP address to the browser, and got the standard Apache web server response for an unconfigured server. Some computer, somewhere on my network, was somehow being reached by this URL and responding!

When you already have Wireshark open, every networking problem looks like a job for packet tracing. So I set up a Wireshark capture filter to log packets to and from the mystery address 192.168.179.1, and set the trace in motion. Nothing. I repeated the browser access to the mystery server. The access succeeded again, but still no packets were logged by Wireshark. I threw the Wireshark capture filter wide open and tried again. Still, no packets to or from 192.168.179.1 were logged.

OK, that leaves just one thing. When you start a Wireshark trace, you have to specify which physical interface is to be traced. Potato has only one physical network interface active, the first wired Ethernet port en0, so naturally I was tracing on en0. The mystery host must be on some other interface, somehow. The command to list network interfaces is ifconfig, and that’s the next thing I ran.

That told the story: ifconfig showed an interface called vmnet8 that was using the IP address 192.168.179.1. In fact, vmnet8 was listed before en0. I speculated that MediaLink was enumerating the IP addresses of the available ports, and choosing the first one it found.
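I obviously can’t see inside MediaLink, but the naive approach I’m guessing at (ask the OS for its interface addresses and take the first usable IPv4 address) would look roughly like the snippet below, and with vmnet8 listed ahead of en0 it would happily pick 192.168.179.1.

// A guess at the failure mode -- not MediaLink's actual code.
// Enumerate interface addresses and take the first non-loopback IPv4 one.
#include <ifaddrs.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    struct ifaddrs *list, *ifa;
    if (getifaddrs(&list) != 0) return 1;
    for (ifa = list; ifa != NULL; ifa = ifa->ifa_next) {
        if (ifa->ifa_addr == NULL || ifa->ifa_addr->sa_family != AF_INET)
            continue;
        if (strcmp(ifa->ifa_name, "lo0") == 0)
            continue;                        // skip loopback, take the next hit
        struct sockaddr_in *sin = (struct sockaddr_in *)ifa->ifa_addr;
        // First match wins -- with vmnet8 ahead of en0, that's 192.168.179.1.
        printf("URLBase host: %s\n", inet_ntoa(sin->sin_addr));
        break;
    }
    freeifaddrs(list);
    return 0;
}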

Another resort to Google quickly revealed that vmnet8 is a virtual networking port installed as a kernel extension (kext) by VMware Fusion. VMware doesn’t try to load it dynamically as needed; it just leaves it installed forever to clutter up your kernel. The VMware Fusion on Potato was a long-expired demo version I didn’t need, so I simply uninstalled it. The vmnet8 port disappeared, without even a reboot. Repeating the trace of packets between the PS3 and Potato, I could see that the URLBase in the XML file had changed to 192.168.1.74, Potato’s IP address on en0, as expected and desired. (If you’re having this problem and need to keep VMware Fusion around, I don’t know what to suggest other than to ask Nullriver for a way to make MediaLink use a specific interface or IP address.)

I ran downstairs and found that I could again browse the server file hierarchy on the PS3. I declared victory and went straight to bed, it being long past bedtime by then.

The next day when I had a bit of time on my hands, I decided to take advantage of my newly repaired video streaming to watch an episode of Enterprise. It didn’t work. The PS3 couldn’t even see the server. OK, maybe I had left the server shut down in my groggy state the night before. I went upstairs and restarted it. Now the PS3 could see the server and browse the hierarchy again, yay. So I fixed some lunch and sat down to watch. Hit Play and nothing happened for a few seconds. Then the PS3 changed the title of the video I wanted to watch to “Corrupted Data” and returned to the hierarchy browser. Nothing would play. Arrgh. This time the error in the upper right corner was “DLNA Protocol Error 2110” or “DLNA Protocol Error 2101”.

This time Google found me a specific answer when I searched for error 2101. I learned that there was a new problem with the alpha version 2.0a1 and beta version 2.0b1 of MediaLink, having to do with sending thumbnail images to the PS3. The thumbnail handling crashes the background program MediaLinkHelper.app, which is the actual DLNA server; I confirmed the crash by examining the system log in Console.app. Once the server crashes, of course nothing works. The solution given is to delete the plug-in that tries to handle thumbnail art, /Library/PreferencePanes/MediaLink.prefPane/Contents/Resources/MediaLinkHelper.app/Contents/PlugIns/AlbumArtTranscoder.mltranscoder. I did that and tested again.

My episode started to play, hurray! After the teaser was over and the theme music began to play, I hit fast-forward, as usual. The picture froze and went silent. That’s not how the fast-forward looked and sounded before, and it’s not an improvement. I hit play. I expected to get video, either with or without having skipped ahead by the amount it should have been fast-forwarding. Instead, I continued to hear silence and see a frozen screen, for quite a few more seconds. Before I could do anything else, it eventually started playing on its own, having fast-forwarded ahead by about the right amount.

Failing to find anything about this problem with Google, I assumed it was a bug in the beta version of 2.0 that I had “upgraded” to. I foolishly did not save the version 1.54 (I think) that I was originally running. Fortunately, Nullriver makes available version 1.72 for customers who are running Tiger, since the 2.0 versions require Leopard. I downloaded version 1.72 and replaced the beta version with it, and got back the nice, smooth fast-forward behavior I had before.

Once again I have declared victory, and I was able to watch my episode of Enterprise without any additional problems. So far.

TinyURL Considered Harmful

Back in late 2006 I wrote the following as a letter to the editor of Motorcycle Consumer News. They printed it, and stopped using TinyURLs! One small lurch forward.

To: editor@mcnews.com

I wish you wouldn’t use tinyurl.com in the magazine, for a number of reasons.

First, any error in a tinyurl code makes the link completely useless. This might be a printing error in the magazine, or a typing error on the subscriber’s part. Either way, there’s no way to guess where the link was intended to go.

Second, a careful computer user is very reluctant to visit a web site “blind” without any idea of where he’s going. Tinyurl makes that sort of reckless behavior mandatory. Even if we completely trust the magazine to vet web sites for safety, any typing error and we could end up anywhere on the web.

Third, the reader may not be sitting in front of the computer as he reads the article. If there’s a real URL on the page, he at least has a chance of remembering what web site was mentioned, so he can find it later when he’s at the computer. Likewise, when reading the magazine he may recognize the URL as one he’s already visited, saving a trip to the computer entirely. There’s no chance of remembering or recognizing a tinyurl.

Fourth, occasionally the writer will succumb to the temptation to give a tinyurl without ever even mentioning the actual company or product he’s referring to. This renders the whole reference completely useless unless the reader is at a computer and able to type in the tinyurl correctly.

Fifth, tinyurl.com could disappear without notice, or turn evil somehow, and where would that leave you? All the tinyurl links in a subscriber’s collection of back issues would be obsolete.

I realize that full URLs are too big to fit nicely in narrow text columns in the magazine. I would suggest that there’s a perfectly good standard solution to problems like this one: footnotes. Instead of a tinyurl, put something like [Link 1] in the text, and put the link at the bottom of the page. You can let it span multiple columns in order to minimize line breaks within the URL. The footnote reference is even more compact than the tinyurl, and the full URL at the bottom of the page avoids all of the disadvantages mentioned above.

Thanks for listening.