
Going With the (Fluid) Flow

Quiz time. There’s a “high-tech” technology that has a very unusual asymmetry about it. In one application area, it ships enormous volumes to consumers and businesses alike. That’s pretty much its only high-volume play to date. Meanwhile, the vast bulk of development projects using this technology are in an area that has absolutely nothing to do with the high-volume space.

What is it? (No fair looking at the title for clues.)

It’s “microfluidics.” Since the study of fluids is a subset of mechanics, and since there are typically electronic aspects to these systems, you’ll often see it included as a MEMS technology, although if you attend a conference like the Microfluidics 2012 show that took place recently, you might be forgiven for being surprised that it has anything to do with MEMS.

Remember chem lab? It’s like that: beakers and flasks and pipettes and titrations and stoichiometry and all that messy stuff that sent many of us running for a digital lab. You can measure the results of a reaction 100 times and get 100 slightly different results, requiring statistics to sort it all out, while, on the other side of the tracks, you can measure the output of a NAND gate 100 times and it’s always the same*.

But, apparently, run though I did, there’s no escaping. I’ve managed to get a few decades’ reprieve, but it’s now coming back, and I have to wrestle with the concept of a lab-on-a-chip. And this brings not only the horrors of chem lab; tagging along with it is biology, the ultra-messy life-science stuff. For, even though the biggest-selling application of microfluidics is in ink jet printing (all of those cartridges that, added up, far out-cost the printer itself), research and development areas overwhelmingly tilt towards medical applications.

Discussions of why this is so boil down to reactions and reagents. If you do macro biochemistry, then you need macro amounts of all of the inputs into the process. Those may involve substances “donated” by a person – blood, perhaps. And we all know how much fun those are to procure. So the less needed, the less we dislike it; microfluidics reduces the amount of material needed by orders of magnitude. (And it’s not just a matter of like or dislike; we may simply not have access to a large quantity of some substance, like a bacterial sample.)

As is typical of a conference dedicated to a nascent technology like this, there were discussions gently promoting solutions as well as those laying out challenges. And in organizing my thoughts on the whole thing, it seemed that the overall problem could be divided into five aspects:

  • Building a place for fluids to go
  • Making the fluids move
  • Controlling that flow at a detailed level
  • Enabling some sort of useful reaction
  • Getting visibility into the results of the reaction

The first topic starts with old-school, greasy-hands technologies like milling, hot embossing, injection molding, and casting. Damn, I’m feeling manlier just reading those. These approaches and other variants have been improved to allow creation of finer features, but there’s only so far you can go with them. In their place you will increasingly find more familiar-looking (to us) photolithographic approaches.

Silicon and glass can be used as substrates, but they’re rather more expensive, and there is a lot of work being done with organics to lower the cost – as is the case with electronics. The materials may sound more exotic – things like PMMA (which we saw being used for directed self-assembly) and epoxy photoresist sheets – substances that may be used as sacrificial materials in an IC, but here are used as structural materials.

In fact, in the small exhibit area of the show, mask-making companies were probably over-represented; they see this as a way to diversify their business at a time when the big IC foundries are increasingly taking their mask-making in-house.

Compared to ICs, the feature sizes in a microfluidic device are large and relatively simple: channels that direct fluids, perhaps mixing, perhaps dividing (more on that in a moment), perhaps snaking around, but nothing that rivals the multi-level nightmares that modern SoCs have become. It’s also simpler than MEMS processing; for the most part, there are no tiny moving parts to build and release.

There are a couple of reasons for this. Cost is a big one: many of the applications envisioned for the long-term involve widespread use of consumable or disposable devices. (Although one wonders what “low-cost” means in a hospital, knowing what a simple aspirin costs there.) Bio-compatibility is another concern: the more complicated the structure and the more materials involved, the harder it is to create something that will be safe to ingest or implant or otherwise integrate into a living organism.

Another important material consideration is the channel’s relationship to the fluids. The channel should be free of mechanical imperfections and snags so that bits of whatever is in the fluid don’t get caught or slow down. The molecular affinity for the reagents also matters – do you want the fluids to “wet” the channel or not? Is the channel inadvertently becoming part of the reaction?

Once you’ve got a place for fluids to go, you’ve got to get them to move. And the grand-daddy of motive forces for liquids can play prominently here. Just as electronic and mechanical phenomena change behavior when dimensions get ridiculously small, so do fluid behaviors. And the capillary “force” is the most obvious evidence of that. It gets water to the tops of trees and it helps blood flow through our – duh – capillaries. And with some microfluidic devices, all you have to do is present a reagent at an input and Mother Nature does the rest, sucking the fluid in via capillary action.
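
To get a feel for why capillarity becomes so potent at these scales, here’s a rough back-of-the-envelope sketch (my own illustration, not from the conference) using Jurin’s law for a water-filled cylindrical channel; the channel radii and contact angle are assumed values.

```python
import math

gamma = 0.072                 # surface tension of water near room temperature, N/m
theta = math.radians(20)      # assumed contact angle against the channel wall
rho   = 1000.0                # density of water, kg/m^3
g     = 9.81                  # gravitational acceleration, m/s^2

# Jurin's law: h = 2 * gamma * cos(theta) / (rho * g * r)
for radius in (1e-3, 100e-6, 10e-6):   # a 1 mm tube vs. typical microchannel radii
    h = 2 * gamma * math.cos(theta) / (rho * g * radius)
    print(f"radius {radius * 1e6:7.1f} um -> capillary rise ~ {h * 100:7.1f} cm")
```

Shrink the channel radius by 100x and the rise grows by 100x – from a centimeter or so in a millimeter-scale tube to over a meter in a 10-micron channel – which is why “just add fluid” works at all.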

There’s an issue, however: anything you do based on capillary action is unidirectional and irreversible. And you really have no control. HP presented a long list of possible alternative micro-pump technologies, many of which have cost or system-integration challenges associated with them. These include reciprocal, rotary, pneumatic, membrane, electro-osmotic, displacement, acoustic, electrowetting, and piezoelectric pumps.

HP further described its own line of work, which uses bubbles. A heater creates a bubble in the fluid, after which the bubble collapses. If the space is asymmetric (the bubble being closer to one wall than another), then the resulting inertial force pushes more in one direction than the other. Repeatedly creating and destroying bubbles gradually moves the liquid in the direction desired.
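
As a purely illustrative toy model – not HP’s published figures – the rectification idea boils down to a tiny net displacement per bubble cycle that accumulates at the heater’s firing rate; both numbers below are assumed.

```python
# Toy model of the bubble-pump rectification idea (numbers assumed, not HP's):
# each create/collapse cycle nudges the fluid a tiny net distance because the
# bubble sits closer to one wall, and those nudges add up at the firing rate.
net_step_um   = 0.05      # assumed net fluid advance per bubble cycle, micrometers
cycle_rate_hz = 10_000    # assumed heater firing rate, cycles per second

flow_speed_um_per_s = net_step_um * cycle_rate_hz
print(f"effective flow speed ~ {flow_speed_um_per_s:.0f} um/s "
      f"({flow_speed_um_per_s / 1000:.1f} mm/s)")
```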

Once you’ve got the fluid moving, you need to control its flow. That may mean bringing together two different fluids, or it may mean taking a flow of cells, for example, and sorting them according to some property. There are a variety of approaches depending on the sorting mechanism. Forests of pillars can act to sort by cell size. An optical device may identify a cell for one bin or another, and an electrical field may then be used to direct the cell into the appropriate bin.
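
A minimal sketch of that detect-then-deflect idea might look like the following; the threshold, readings, and bin names are hypothetical stand-ins for a real optical sensor and deflection electrode.

```python
# Minimal sketch of optical detect-then-deflect sorting. The fluorescence
# readings and threshold are made up; in a real device the routing decision
# would drive an electric field that steers the cell toward an outlet.
def route_cell(fluorescence: float, threshold: float = 0.7) -> str:
    """Pick a bin for a cell based on its measured fluorescence."""
    return "target_bin" if fluorescence >= threshold else "waste_bin"

readings = [0.12, 0.85, 0.40, 0.91, 0.05]   # simulated stream of cells
for reading in readings:
    print(f"fluorescence {reading:.2f} -> {route_cell(reading)}")
```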

The next problem is getting a reaction to happen. In the simplest cases, this involves pushing the fluids past features that have been “functionalized.” While much of the material in a microfluidics device is simply structural, key areas may be chemically modified to make them useful in other ways. It’s the equivalent of, “Don’t just sit there looking pretty; do something!” So molecules might be added to recognize or attract a particular substance in the flow – perhaps a specific protein. In this case, the reaction occurs between the fluid and some fixed part of the structure.

Another approach is to mix reagents, and there’s an intriguing idea being developed to improve reactions and their rates. Rather than using streams of reagent, the reagents are encapsulated in drops within an “opposing” liquid. Drops of hydrophilic material (miscible in water) might be suspended in a hydrophobic fluid like oil (or vice versa). The idea behind this is deceptively simple: reaction rates are proportional to reagent density. Higher density means faster results. You can increase density either by adding more reagent to an existing volume or by decreasing the volume. And a droplet is a very small volume.
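
To put a rough number on that, here’s an illustrative calculation (all values assumed) showing how the same fixed amount of reagent becomes far more concentrated as the droplet shrinks; concentration scales as 1/r³, so a 10x smaller radius means 1000x the density.

```python
import math

# The same fixed amount of reagent packed into ever-smaller droplets.
moles_of_reagent = 1e-15                    # assumed: one femtomole of reagent

for radius_um in (100.0, 10.0, 1.0):
    radius_m = radius_um * 1e-6
    volume_liters = (4.0 / 3.0) * math.pi * radius_m ** 3 * 1000.0  # m^3 -> L
    concentration = moles_of_reagent / volume_liters                # mol/L
    print(f"droplet radius {radius_um:6.1f} um -> "
          f"concentration {concentration:.2e} M")
```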

So, for example, you can take two droplets, each containing a different reagent, and then get them to fuse, mixing the reagents and enabling the reaction in the tiny combined droplet. Droplets can also be split with some degree of success, and they can be directed much like blood cells. They also provide a level of isolation, making it harder for external materials (“wild” species) to contaminate the reaction or, in the case of cell growth, out-compete the desired cells.

Finally, the way in which results are detected can vary widely, according to the problem at hand. One common way is to add a fluorescent “label” to cells to provide an optical signal. Cells and droplets can be sorted, as mentioned before. There are other sorting mechanisms that take advantage of adhesion or even low-level magnetic fields.

All of this comes together into what feels like a completely new world to a digital denizen like me. The devices look different, the applications are different, and the underlying technologies have different roots, but it’s converging, much as MEMS has, towards similar fabrication techniques and, perhaps, towards integrating electronics where possible – both to shrink the implementation of things like heaters or pumps and to detect results. We’ll be keeping an eye on this as part of our overall MEMS coverage.

 

* OK, things are becoming somewhat more statistical even for digital logic as things shrink to the absurdly small, but never mind that for now…

 

More info on conference participants:

Microfluidics 2012
