epMotion 5075 teardown

In my PhD lab we had an epMotion 5075 pipetting robot. I had a like/hate relationship with this machine. Like: it’s an impressive, precision-engineered piece of hardware. Hate: the software is appalling. Writing protocols for it was slow, frustrating and generally awful, and it offered very little flexibility in what one could make it do.

Recently I heard that the lab was having a clear-out, including disposing of this (pricey when purchased) robot, so I asked if I could adopt it rather than see it go to the scrapheap, which I was kindly allowed to do. I’m not in a wet-lab at the moment, so for now it will live in a garage, but I did want to have a peek inside to better understand how it works, and to work out whether it would be possible to customise it to be more flexible.

If I were buying my own scientific hardware I would always go for upstart companies like OpenTrons and Incuvers, which tell you how their hardware works and allow you to do whatever you want with it. With the epMotion, by contrast, if you want to use new labware you have to send a physical sample of it to the company, which measures it to generate a proprietary calibration file.

This video gave me some hope that it might be possible to customise the robot: in it, someone has replaced all of the robot’s electronics with a standard board for a CNC machine:

But other than that I could find very little on the internet about what is inside these robots. I think that’s a shame, and now I have one at my disposal, with no warranty to worry about. So here is a run-down of what you find when you take it apart, in case it is useful to anyone in a similar position.

First steps

The back panels come off very easily with a hex key and expose the computer that runs the machine. This runs some version of Windows, maybe Windows CE. It has USB and Ethernet ports, although to my knowledge these can’t be used for anything useful on my version of the robot. In general I doubt there is any easy way to make this computer do anything other than what Eppendorf has programmed it to do, without access to the underlying source code.

Removing the top required, in my case, peeling off a little double-sided tape on either side, in addition to undoing two hex-key bolts.

There is a heavy-duty belt for the X axis, driven by a big stepper motor. My robot had been essentially unused for several years and the rail over which the X-carriage runs had become covered with a sticky residue. This caused the motor to stall mid-run, but cleaning it off with some alcohol resolved the issue.

The computer that is the brains of the operation – unfortunately unlikely to be easily repurposable.

Basics

Each of the X, Y and Z axes is driven by a stepper motor (the X-axis one is this). Each axis also has a four-wire optical endstop. In the video above these endstops have been replaced with mechanical switches, but it really should be possible to use them as-is.

X-belt and optical end-stop
Y-axis stepper motor, belt, and optical end-stop.

Cabling

One of the challenges of making a many-axis robot is that signals have to be carried out to each successive axis, all of which are connected together. So flexible cabling is needed – but at the same time it must not get in the way or dangle into the samples. In the case of the epMotion this is achieved with ribbon cables like this:

But it quickly becomes apparent that this cable doesn’t have enough wires to be connected directly at the other end to stepper motors, endstops and so on. Instead it seems to be some sort of serial cable that carries data signals to a series of other microprocessors – one on the robot’s pipetting arm, and one for each of the Y and Z axes – which then interface with the Y motor, the Z motor, the tool-locking motor, the pipetting motor, the tip-ejecting actuator, and the range detector.

If you want to hack this thing you’ll have to decide whether to make and mount four separate pieces of control hardware, or to replace the cabling with a much thicker bundle of wires.

Pipetting arm

Lurking under the metal cover of the tool arm is a profusion of electronics. There’s a lot to do: an (infrared?) sensor to measure distance, plus actuation for grabbing a tool, identifying it, pipetting up and down, and ejecting a tip.

Selecting/using tools

One of the very impressive things about the epMotion robot is its ability to change tools during operation. It can choose from a variety of single-channel and multichannel pipettes, and even a plate gripper.

Tools

How does this process work?

The tool arm has two coaxial motors. One is, I believe, a simple DC motor that is geared down heavily. It rotates a piece of metal internal to the arm which causes it to firmly grip whichever tool it is currently over. I’m not quite sure how the robot knows when this rotation is finished. My suspicion is that it detects the change in current flowing through the motor when the motor stalls at the end of its travel. Certainly, if you disconnect this motor the robot is able to detect that ‘the engine is not responding’, and informs you so.

Looking up at the inside of the tool gripper to see how it works.
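If the stall-current guess is right, detecting the end of the locking rotation in a custom controller could look something like the sketch below. This is just an illustration of the idea: set_motor and read_motor_current are hypothetical stand-ins for whatever motor driver and current-sensing hardware you wire up, and the threshold would need measuring on the real machine.

```python
import time

STALL_CURRENT_A = 0.8   # assumed stall threshold - would need measuring on the real motor
TIMEOUT_S = 5.0         # give up if the motor never stalls

def lock_tool(set_motor, read_motor_current):
    """Run the DC locking motor until it stalls against the tool.

    set_motor(on) and read_motor_current() are hypothetical helpers for an
    H-bridge and a current-sense resistor/ADC - not anything in the epMotion.
    """
    set_motor(True)
    start = time.time()
    try:
        while time.time() - start < TIMEOUT_S:
            if read_motor_current() > STALL_CURRENT_A:
                return True      # current spike: motor has stalled, tool is gripped
            time.sleep(0.01)
        return False             # never stalled - the equivalent of "engine not responding"
    finally:
        set_motor(False)
```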

When one examines the pipettes themselves one notices that they have electrical contacts, but these are simply used to tell the robot which tool is in which position. The pipettes are in fact mechanical rather than electronic devices. They all have the same rotatable top-piece, and as this is spun by a stepper motor in the tool arm they aspirate/dispense liquid (or, in the case of the gripper, grab and release). As this piece is rotated the tool extends a thin rod out from it. Inside the tool gripper this rod must make contact with a switch, and this is used to “home” the pipette so that the robot knows the position of the plunger.

Homed tool with thin rod extended to make contact with switch. Electrical contacts for tool ID visible to the right.
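A homing routine for the plunger, under my reading of the mechanism, would then be the usual stepper-homing loop: rotate until the rod trips the switch, back off, and call that zero. Again, step_once and switch_pressed are hypothetical placeholders rather than any real epMotion interface.

```python
def home_plunger(step_once, switch_pressed, max_steps=20_000, backoff_steps=200):
    """Rotate the tool's top-piece until the extending rod hits the switch,
    then retreat slightly and treat that position as zero.

    step_once(direction) and switch_pressed() are hypothetical helpers for
    the plunger stepper and the switch inside the tool gripper.
    """
    for _ in range(max_steps):
        if switch_pressed():
            break
        step_once(+1)                 # extend the rod towards the switch
    else:
        raise RuntimeError("home switch never triggered")

    for _ in range(backoff_steps):    # back off so the switch is released
        step_once(-1)
    return 0                          # plunger position now defined as zero
```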

Prospects for customisation

I’m going to pause my hardware work here, because it isn’t yet clear exactly what I will use the robot for, and I don’t want to destroy any functionality I might need.

If I had continued, I would have tried, one way or another, to marry up the epMotion hardware with the open-source OpenTrons robot-control software. This basically means adapting the hardware so that one knows how to control it, and then writing a custom driver for the OpenTrons software.

I do think this is completely achievable. The video above already shows how three-axis control is possible using a standard CNC board. Controlling aspirate/dispense as a fourth axis should be similarly simple. If my understanding of how the tool interlock works is correct, then that also wouldn’t be too challenging – one would just need to measure the current flowing through the motor. An even simpler strategy would be to keep one tool permanently locked onto the machine.
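To make that concrete, here is a rough sketch of what driving the machine might look like once the electronics have been replaced with a G-code-speaking CNC board, as in the video. Everything here is an assumption: the port name, the feed rates, the coordinates, and in particular which axis letter the firmware would assign to the plunger stepper.

```python
import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # assumed port for the replacement CNC board

def send(ser, line):
    """Send one G-code line and wait for the controller's reply (e.g. 'ok')."""
    ser.write((line + "\n").encode())
    return ser.readline().decode().strip()

with serial.Serial(PORT, 115200, timeout=2) as ser:
    time.sleep(2)                       # let the controller finish resetting
    send(ser, "G21")                    # units: millimetres
    send(ser, "G90")                    # absolute positioning
    send(ser, "G0 X100 Y50 Z10")        # move above a (made-up) well position
    send(ser, "G1 Z-20 F300")           # lower the tip into the well
    send(ser, "G1 A5 F100")             # aspirate by turning the plunger axis
                                        # (the 'A' letter depends entirely on the firmware)
    send(ser, "G0 Z10")                 # lift back up
```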

One decision would be whether to have a single control board with lots and lots of wires running to the tool arm, or to keep the existing ribbon cables and put a separate controller on the tool arm, driven over serial. I suspect the latter might be the better approach.

More generally, if I do this I will have to consider whether I want to be limited to expensive epMotion robot tips, the only ones compatible with any of these tools. I suspect the answer is no. In that case I might end up bolting an OT-2 electronic pipette to the pipetting arm, though this again loses the advantages of tool-changing. Or maybe I’ll go with something completely different like a vacuum pump and a peristaltic pump – we’ll see.

In general none of this looks trivial, and one is almost certainly better off just buying an inexpensive OT-2. Still, it’s nice to have a better understanding of what is going on inside this intricately engineered machine.


Update:

It has just occurred to me (another useful reason for writing things down) that there may be an easier and less invasive way to get control of this thing. If one can reverse-engineer the serial protocol that the computer uses to control the Y axis, Z axis, tool interlock, aspiration, tip ejection (and distance measuring), then one can get control of all of these without touching their hardware. It seems possible that this could be achieved relatively simply (if the commands are sent in a text-based format), and when I have access to the machine again in six months’ time I will investigate. The 8 leads in the ribbon cable could be: V+, GND, Y-out, Y-in, Z-out, Z-in, pipette-out, pipette-in.
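A first, purely passive step would be to tap one of the ribbon-cable data lines and log the traffic while running a very simple protocol from the epMotion software. The sketch below assumes the tapped line turns out to be an ordinary UART at a guessable baud rate and voltage (a logic analyser would be the safer starting point); the port and baud rate are placeholders.

```python
import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"   # USB-serial adapter tapped onto one ribbon-cable line (assumed)
BAUD = 115200           # a guess - worth iterating over common rates

with serial.Serial(PORT, BAUD, timeout=0.1) as ser, \
        open("epmotion_sniff.log", "w") as log:
    start = time.time()
    while True:                         # stop with Ctrl-C
        data = ser.read(64)
        if data:
            # timestamp + raw bytes as hex, to look for framing and repeated commands
            log.write(f"{time.time() - start:10.3f}  {data.hex(' ')}\n")
            log.flush()
```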

BigGAN interpolations

The state of the art in image generation is BigGAN.

Now, some trained models have been made available, including the capacity to interpolate between classes. I made a Colab notebook to easily create animations from these.

They are pretty fun.

What is more, they make it clear that the latent space captures very meaningful shared properties across classes. The poses of quite different animals are conserved, and “cat eyes” clearly map onto “dog eyes” during interpolation. These sorts of properties suggest that the network ‘understands’ the scene it is generating.
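The interpolation itself is very simple: take two (latent, class-embedding) pairs, blend between them, and render one frame per step. In the sketch below, generate stands in for whatever wrapper around the pretrained BigGAN generator you are using; it is not a real function from any particular library.

```python
import numpy as np

def interpolate(z_a, z_b, y_a, y_b, steps=48):
    """Yield (z, y) pairs linearly interpolated between two endpoints.

    z_a/z_b are latent vectors, y_a/y_b class embeddings. Each pair would be
    fed to a pretrained BigGAN generator (not shown) to render one frame.
    """
    for t in np.linspace(0.0, 1.0, steps):
        yield (1 - t) * z_a + t * z_b, (1 - t) * y_a + t * y_b

# e.g. frames = [generate(z, y) for z, y in interpolate(z_cat, z_dog, y_cat, y_dog)]
# where generate is a hypothetical wrapper around the BigGAN model.
```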

Continue reading “BigGAN interpolations”

Adventures with InfoGANs: towards generative models of biological images (part 2)

In the last post I introduced neural networks, generative adversarial networks (GANs) and InfoGANs.

In this post I’ll describe the motivation and strategy for creating a GAN which generates images of biological cells, like this:
Continue reading “Adventures with InfoGANs: towards generative models of biological images (part 2)”

Adventures with InfoGANs: towards generative models of biological images (part 1)

I recently began an AI Residency at Google, which I am enjoying a great deal. I have been experimenting with deep-learning approaches for a few years now, but am excited to immerse myself in this world over the coming year. Biology increasingly generates very large datasets and I am convinced that novel machine-learning approaches will be essential to make the most of them.

At the beginning of my residency, I was advised to complete a mini-project which largely reimplements existing work, as an introduction to new tools. In this post I’m going to describe what I got up to during those first few weeks, which culminated in the tool below that conjures up new images of red blood cells infected with malaria parasites:
Continue reading “Adventures with InfoGANs: towards generative models of biological images (part 1)”

How I stumbled upon a novel genome for a malaria-like parasite of primates

Sometimes in science you come across things that are definitely interesting, and useful, but which you don’t have time to write up properly for one reason or another. I’m going to try to get into the habit of sharing these as blog-posts rather than letting them disappear. Here is one such story.


Not long ago I was searching for orthologs of a malaria gene of interest on the NCBI non-redundant database, which allows one to search across the entire (sequenced) tree of life. Here is a recreation of what I saw:

I was surprised to see that nestled among the Plasmodium species was a sequence from a species called Piliocolobus tephrosceles. Continue reading “How I stumbled upon a novel genome for a malaria-like parasite of primates”

Saving 99.5%: automating a manual microscope with a 3D printed adapter

TL;DR: Some 3D-printing hackery can create an automated microscope stage from a manual stage for ~0.5% of the cost from the manufacturer.


I have always wanted access to a microscope with an automated stage. The ability to scan an entire slide/plate for a cell of interest seems to unlock a wealth of new possibilities.

Sadly, these systems cost quite a bit. The lab I work in now has a Leica DMi8 microscope with automated movement in Z. But XY movement is (on our model) still manual. It is possible to purchase an automated XY stage for this microscope, but the list-price quote is around £12,000 (including stage, and control hardware and software).

I’m not going to argue that this price is unreasonable. I am sure that the manufacturers of scientific equipment spend a lot of time and money innovating, and that money has to be made back by selling devices which have relatively small production runs. Nevertheless, the result is that the costs of kit that makes it to market are fairly staggering – and this prevents someone like me from being able to play around with an automated stage.

But I still wanted to experiment with an automated stage! So I wondered how easy this would be to do myself. After all, we have a manual stage, and we move it by rotating two knobs. Couldn’t I just get motors to turn those instead of doing it with my hand?

As I thought this through further I realised it was slightly more complicated than this. Firstly, the knobs are coaxial, making them rather harder to deal with than two separate shafts would be. And secondly, as you rotate the X-knob, the shaft moves in X.

So the motors need to be able to move with it. But they also need to be able to rotate and exert a twisting force on the knob – so they need to move linearly while being locked in one orientation.

Hardware: 3D printed pieces, 2 stepper motors and a RAMPS controller

I made a quick design in OpenSCAD:

Basically, the first knob, which controls movement in Y, is simply connected to the mechanism by a (red) sleeve which connects to a motor below. The knob above, which controls movement in X, is placed inside a (blue) sleeve which surrounds it with a gear. That gear is turned by a (turquoise) gear driven by a second motor. Both motors are mounted on a (transparent) piece which also connects them to an LM6LUU linear bearing, which allows them to slide but keeps their orientation constant.

I printed out these three pieces, then tweaked the dimensions a little to be more snug on the knobs and printed them again. The final STL files, and the SCAD file that generated them, are available on Thingiverse.

To control it I connected the steppers to a trusty RAMPS 3D-printer controller. These cost £30 with a screen and a rocker controller (the Leica hardware to control a stage is ~£3k). Since the 3D-printer controller is also set up to control the temperature of a hot-end and a heated bed, it might be ideal if you want to add a warm stage down the line.

Initial tests controlling the position of the stage using the RAMPS controller went well, and let me calibrate the number of steps per micrometer.
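The calibration itself is just arithmetic: command a known number of steps, measure the resulting travel (a stage micrometer slide works well), and divide. The numbers below are placeholders, not my actual calibration values.

```python
# Command a known number of steps, measure the travel under the microscope, divide.
commanded_steps = 16_000        # e.g. 10 moves of 100 full steps at 16x microstepping
measured_travel_um = 512.0      # measured with a stage micrometer slide (placeholder)

steps_per_um = commanded_steps / measured_travel_um
print(f"{steps_per_um:.2f} steps per micrometer")

# To move 1 mm in X you would then command round(1000 * steps_per_um) steps.
```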

Software: MicroManager

Regrettably, the Leica software isn’t going to allow you to easily hook it up to an Arduino-based controller. But, as ever, open-source software comes to the rescue. Micro-Manager is a very advanced ImageJ plugin that can connect to the Leica camera, and to the microscope itself to control filter cube positions, Z-focusing, etc.

Don’t expect quite the user-friendliness of the Leica software from Micro-Manager, but do expect a wealth of packages to perform common operations in automated microscopy (Leica charges ~£2.5k for the software to revisit a position multiple times – which was included in the quote given above).

Theoretically, MicroManager even allows you to control XY position using a RAMPS controller – someone has already written a package for exactly this board. This step, which should have been trivial, was actually the most complicated. The device adapter is designed to ask the RAMPS controller for its version, and somehow I could never make my board submit a response that the software was happy with. I had to download the MicroManager source and remove the code that checked the version. Successfully setting up the build environment for Windows took an age. Do get in touch if you have a similar project and want the DLL I built [update: DLL here, I offer no guarantees at all that it will work. This is an x64 build which will only work with a recent nightly build] [update 2: Nikita Vladimirov has followed up on this and released the changes he had to make to MicroManager]. Anyway, to cut a long story short I got MicroManager to talk to the RAMPS board successfully.

Testing by making a 100X oil immersion slide scanner

Now to put it into practice.

I wrote a Beanshell script to scan a slide in X and Y and capture images. In this case I captured images in a grid 40 microscope images wide by 30 microscope images high, for a total of 1200 images.

This took a few minutes – try doing that by hand. Then I stitched them together with the MIST plugin. The result is a 27,000 x 12,000 pixel image, featuring a whole lot of red blood cells. You can zoom in on the version below. This was taken with a 100X oil immersion objective, at which magnification the smallest motion of the stage covers a substantial fraction of the image, but still leaves enough overlap for stitching.
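For anyone curious, the logic of the scan script is just a nested loop over a grid of stage positions. My version was a Beanshell script inside Micro-Manager; below is a rough Python equivalent using the MMCore bindings. The config file name, step size and tile spacing are assumptions specific to my setup.

```python
import numpy as np
import MMCorePy   # Micro-Manager's Python bindings (the original script was Beanshell)

core = MMCorePy.CMMCore()
core.loadSystemConfiguration("MMConfig_ramps.cfg")   # hypothetical config with camera + RAMPS stage

STEP_UM = 120            # stage step between tiles, chosen to leave overlap for stitching (placeholder)
COLS, ROWS = 40, 30      # 1200 tiles, as in the scan described above

tiles = []
for row in range(ROWS):
    for col in range(COLS):
        core.setXYPosition(col * STEP_UM, row * STEP_UM)
        core.waitForSystem()              # wait for the stage to finish moving
        core.snapImage()
        tiles.append(np.array(core.getImage()))
# The tiles can then be saved and stitched (I used the MIST plugin in ImageJ).
```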

Fun! Still a bit more experimenting to do, but I’m hoping to get this acquiring images of tagged proteins from 96-well plates.

Caveat for anyone who tries to implement this: obviously be very careful not to create significant non-twisting forces on the coaxial knobs – you don’t want to damage your stage and ruin the alignment.

Petition grapher

The Houses of Parliament in the UK host a petitions website. When a particularly popular petition is trending it can be addictive to refresh the page and watch the numbers going up before your eyes. But of course what one really wants is a graph.

So I built this tool, which regularly queries the Parliament API and stores historical petition signature counts in a database, from which it draws graphs.
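The core of the tool is a small polling loop. Here is a minimal sketch of the idea: fetch the petition’s JSON from the public API, append the signature count to a local database, and repeat. The petition ID is a placeholder, and the exact JSON field names are from my reading of the API rather than anything guaranteed.

```python
import sqlite3
import time
import requests

PETITION_ID = 123456    # placeholder petition number
URL = f"https://petition.parliament.uk/petitions/{PETITION_ID}.json"

db = sqlite3.connect("petitions.db")
db.execute("CREATE TABLE IF NOT EXISTS counts (t REAL, petition INTEGER, signatures INTEGER)")

while True:
    payload = requests.get(URL, timeout=30).json()
    count = payload["data"]["attributes"]["signature_count"]   # field names as I understand the API
    db.execute("INSERT INTO counts VALUES (?, ?, ?)", (time.time(), PETITION_ID, count))
    db.commit()
    time.sleep(60)      # poll once a minute; the stored history is what gets graphed
```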

The Up-Goer Five Text Editor

When Randall Munroe released the amazing Up-Goer Five comic, I was inspired to hack together a customised editor so that I could explain my own research in the limited vocabulary of the thousand most common words.
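The core of the editor is a trivial check: split the text into words and flag anything not in the permitted list. A minimal sketch, assuming you have a copy of the “ten hundred” word list saved as a text file with one word per line (the file name here is made up):

```python
import re

# Assumed input: a text file with the "ten hundred" most common words, one per line.
with open("ten_hundred_words.txt") as f:
    ALLOWED = {line.strip().lower() for line in f if line.strip()}

def disallowed_words(text):
    """Return the words in `text` that are not in the permitted vocabulary."""
    words = re.findall(r"[a-zA-Z']+", text.lower())
    return sorted({w.strip("'") for w in words if w.strip("'") not in ALLOWED})

print(disallowed_words("I study the tiny living things that make people sick"))
```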

For me the most exciting bit about this was that I got an email from Mr. xkcd himself. It also became a bit of a science-communication craze. A lot of scientists got in on the action (some kind people have made a collection here).

The editor has featured in The Guardian (and again). More recently it’s even been used for conference sessions, which are pretty fantastic. (The 9th talk in that playlist is particularly excellent). It’s also quite popular as a science communication training exercise, although it goes without saying that no-one is suggesting that this is the best way to communicate.