Bypassing 164 years of tradition with ‘any immersion microscopy’
Posted by Alfred Millett-Sikking, on 24 April 2023
The optical microscope is a classic scientific instrument with a straightforward purpose: to observe objects in more detail than is possible with the naked eye. Many microscope variations exist, from the rudimentary examples of the 17th century, to modern computer-controlled systems with sophisticated designs. Despite the variety, most optical microscopes reuse the same physical principles and key elements, including an ‘immersion medium’.
Here we revisit the basics for a broad class of microscopes and discuss what we mean by ‘immersion medium’, why it is important, and how it has traditionally been constrained. We then present a new approach for ‘any immersion microscopy’, which offers new microscope design options and new possibilities for imaging 3D objects. We hope to introduce less experienced readers to the key concepts and leave the more thorough treatment to the original article [Millett-Sikking 2022].
Traditional immersion microscopy
We’ll start by considering a basic microscope and explaining how it operates. Let’s imagine our goal is to see ‘a small bright 3D object with fine details’ (e.g. a fluorescent cell). The object emits light in all directions but is too small to see by eye, so we’ll use lenses to collect some of the emitted light and create a magnified image. We can then look at the larger image by eye or with a camera. However, if we want an image with fine detail, then we need to collect light from the object over a wide angular range (i.e. a diverse set of directions), which requires a special lens called a microscope objective.
A microscope objective can resolve small details in the object, and also make bright images due to the high angular collection. However, the high angular collection also reduces the depth of field in the object (i.e. the axial extent over which the object will appear in focus in the image), so a microscope objective will only show objects in focus that are within a shallow range of distances from the objective, often just a few micrometers. We usually think of this shallow depth of field as the focal plane.
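To put a number on ‘shallow’, a common diffraction-based estimate for the depth of field is DOF ≈ λ·n/NA², where λ is the emission wavelength, n the immersion refractive index, and NA the numerical aperture. A minimal sketch (the specific objective parameters are illustrative, not from the original article):

```python
def depth_of_field_um(wavelength_um, n_immersion, na):
    """Diffraction-limited axial depth of field: DOF ~ lambda * n / NA^2."""
    return wavelength_um * n_immersion / na**2

# High angular collection (NA 1.2 water objective, green emission ~0.52 um)
dof_high = depth_of_field_um(0.52, 1.33, 1.2)   # roughly 0.5 um
# Modest angular collection (NA 0.25 air objective) for comparison
dof_low = depth_of_field_um(0.52, 1.0, 0.25)    # roughly 8 um
print(f"high-NA: {dof_high:.2f} um, low-NA: {dof_low:.2f} um")
```

The quadratic dependence on NA is why high-resolution objectives confine sharp focus to about a micrometer or less.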
We now have our first problem. We set out to see a small bright 3D object with fine details, and instead we can only see a 2D slice of the object with fine details (because the high angular collection of our microscope objective has restricted our view to a shallow focal plane). So how can we see an object with fine details in 3D? The traditional fix for this problem is to adjust the axial position of the focal plane, and then build up a 3D volume with a series of 2D images taken at different depths in the object. For example, on a traditional microscope we usually move the objective (or the object) up and down on a linear stage, a process we will refer to as ‘standard focus’.
So, it would appear we can now image fine details in 3D with our microscope by using the high angular collection of the objective and standard focus. However, this leads us to our second problem. An objective only images well if the space between the front of the objective and the focal plane is configured in an exact way. Specifically, the focal plane must be a fixed distance from the objective, and the intermediate space must consist of one or more layers of predetermined refractive index (RI). For example, an ‘air’ objective is intended to image in air (RI 1.0), and introducing anything other than air between the objective and the focal plane will degrade the image. This also applies to ‘water’ (RI ~1.33) and ‘oil’ (RI ~1.51) objectives, which only image well into the stated medium. In fact, all objectives with high angular collection can be considered ‘immersion’ objectives, which means they only image well into the specified immersion medium.
The problem surfaces when we use ‘standard focus’ to adjust the focal plane and image in 3D: doing so inevitably puts layers of the object between the objective and the focal plane, which changes the optical configuration. So, for example, if we use an ‘air’ objective to look into a watery object, the image will degrade as we move the focal plane deeper into the object, since we are removing layers of air and adding layers of water. This situation, where the sample RI differs from the immersion RI, is common in microscopy, and we refer to it as ‘immersion mismatch’.
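Beyond degrading the image, the mismatch also rescales how far the focus actually travels: in the paraxial approximation, moving the objective (or stage) by a distance d shifts the focal plane inside the sample by roughly d·n_sample/n_immersion. A hedged sketch using nominal index values:

```python
def focus_shift_um(stage_move_um, n_immersion, n_sample):
    """Paraxial estimate of the actual focal-plane shift inside the sample
    when the objective (or stage) moves by stage_move_um."""
    return stage_move_um * n_sample / n_immersion

# Air objective (n=1.0) focusing into a watery sample (n~1.33):
# ~13.3 um of focus travel for every 10 um of stage travel.
shift = focus_shift_um(10.0, 1.0, 1.33)
```

This simple scaling sits on top of the depth-dependent spherical aberration that actually blurs the image.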
However, if we use a ‘water’ objective to look into a watery object, the image will not degrade as we move the focal plane deeper into the object, since we are removing layers of water and replacing those layers with water (so the optical configuration is maintained). Therefore, when the object medium and the immersion are the same, we can get high quality 3D imaging, a special case we will refer to as ‘immersion matched’. The importance of immersion was first realized in the mid-19th century, and the first objectives designed for a medium other than air (like water or oil) were presented in 1858, which we now refer to as immersion objectives.
Immersion objectives enable high quality 3D imaging in two principal ways. Firstly, by maintaining the correct optical configuration for an immersion matched object (as described above). Secondly, by allowing the highest angular collection of light from the object. For example, if we use an ‘air’ objective to look at a watery object, then the angular range of the emitted light is limited by total internal reflection (TIR) as it leaves the object and enters the air medium. However, if we use a ‘water’ or ‘oil’ objective to look at a watery object then there is no object light TIR, and so we can collect light over a higher angular range, which gives higher resolution.
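The TIR limit follows directly from Snell's law: light inside a watery object can only escape into a lower-index medium up to the critical angle. A quick check with nominal indices:

```python
import math

def critical_angle_deg(n_object, n_immersion):
    """Critical angle for total internal reflection at the object/immersion
    boundary (requires n_object > n_immersion)."""
    return math.degrees(math.asin(n_immersion / n_object))

theta_c = critical_angle_deg(1.33, 1.0)   # ~48.8 degrees inside water
# The collectable NA from the watery object is therefore capped at
# n_object * sin(theta_c) = n_immersion, i.e. 1.0 for an air objective,
# whereas a matched water objective can approach NA ~1.33.
na_cap = 1.33 * math.sin(math.radians(theta_c))
```

Rays steeper than ~48.8° never leave the water, so no air objective, however well made, can collect them.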
So, we can achieve our ‘goal’ using immersion objectives, and use our microscope to see small bright 3D objects with fine detail. However, we are restricted to using a specific type of immersion objective for objects in different immersion mediums, which is highly impractical! Microscope objectives are complex to design and build, and expensive to buy, so having an immersion objective for every conceivable immersion medium is not realistic. Some specialized objectives can be tuned to different immersion mediums over a limited range. However, in general, we limit the immersion medium choices to air, water, and a few kinds of oil, which we can think of as the standard immersion mediums.
A typical way to alleviate the problem of immersion mismatch is to use a thin layer of glass to separate the intended immersion (like air, water, or oil) from the object. This layer of glass can also serve to isolate samples from the objective, and from other samples, which has additional benefits. However, for the purposes of imaging in 3D, the glass layer can ensure that the space between the objective and the focal plane is mostly of the correct immersion, so only the depth we image into the object can create an immersion mismatch. The glass layer, often referred to as ‘coverglass’ or ‘coverslip’, is incorporated into the design of the objective so it does not disturb the optical configuration. We usually refer to this kind of lens design as a ‘coverslip corrected microscope objective’.
In summary, to see small bright 3D objects with fine details we can use a coverslip corrected microscope objective and attempt to match the immersion medium with the object. This is hard to achieve in practice, since even small changes in the refractive index of the object (e.g. 0.01) can affect imaging performance, and we are typically limited to the ‘standard immersion mediums’. If we successfully match the immersion to the object, then we get high quality 3D imaging. If we don’t match the immersion to the object, then we accept that the images will degrade as we look deeper into the object. This has been the tradition for the last 164 years.
Any immersion microscopy
So far we have considered a traditional microscope, where a primary objective lens near the object is often used with a secondary lens called a ‘tube lens’ to produce a high quality 2D image with a large positive magnification (for example 100x). The magnification is of course intentional, so that we can view a large 2D image of the focal plane. However, for bright 3D objects, light is also being emitted above and below the nominal focal plane, and this light is also collected by the microscope objective. So actually, for a 3D object, it is more correct to consider a 3D image.
Unfortunately, the deliberate magnification in our traditional microscope also has the side effect of distorting the 3D image. In particular, a large positive magnification can produce an elongated 3D image with substantial aberrations away from the nominal focal plane, so although the 3D image exists in some form, it is not useful in most situations. However, in the special case where we apply ‘uniform’ magnification in all directions it is possible to avoid these distortions. This was first proposed by Maxwell in 1858, which happens to coincide with the year the first immersion objectives were showcased. We can think of this uniform magnification as creating a ‘perfect 3D image’.
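The distortion has a simple paraxial origin: while the lateral magnification is M, the axial magnification scales as M² times the image-to-object refractive index ratio, so the two only agree at one specific M. A sketch of the arithmetic (the 100x figure is illustrative):

```python
def axial_magnification(m_lateral, n_object, n_image):
    """Paraxial axial magnification: M_ax = M_lat^2 * n_image / n_object."""
    return m_lateral**2 * n_image / n_object

# A 100x image of a watery object formed in air is stretched ~75x more
# axially than laterally:
m_ax = axial_magnification(100, 1.33, 1.0)   # ~7519x axial vs 100x lateral

# Maxwell's uniform-magnification condition (M_ax == M_lat) is only
# satisfied when M = n_object / n_image:
m_uniform = 1.33 / 1.0
```

Only at M = 1.33 do the lateral and axial magnifications coincide, which is exactly the ‘remote refocus condition’ discussed below.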
A 3D image is an attractive proposition. For example, we can imagine avoiding ‘standard focus’ (keeping the primary objective and object stationary), and then exploring the 3D image with other optics that move elsewhere. Indeed, this was the brilliant idea of Botcherby et al. in their 2007 publication, where they proposed a method using three microscopes in series. In this seminal work, they used the first two microscopes to achieve the uniform magnification condition proposed by Maxwell, and the last microscope to magnify different focal planes of the remote 3D image onto a camera. In general, we will refer to this kind of setup as a ‘remote refocus microscope’.
Remote refocus microscopes can quickly access different focal planes in a 3D object without disturbing it [Millett-Sikking 2018] and have become a key method in high-speed 3D imaging applications like single-objective light-sheet (SOLS) microscopes [Millett-Sikking 2019, E. Sapoznik 2020, Chang 2021, Yang 2022, Chen 2022] and other foundational variants [Dunsby 2008, Bouchard 2015, Yang 2018, Kumar 2018]. However, despite the variety of publications, many of these remote refocus microscopes share the same fundamental oversight.
When we design and build a remote refocus microscope, we have to decide how to achieve the uniform magnification we need for a 3D image. This means selecting a series of optics that can collect light from the object and create an image with the same magnification in all directions. However, because lateral and axial magnification scale differently, we can only create a good 3D image under a precise, predetermined magnification. Specifically, we must set the magnification equal to the ratio of the refractive indices of the object and image spaces, which we will refer to as the ‘remote refocus condition’.
In practice, the remote refocus condition is easy to quantify. Typically, we choose to form the 3D image in air, and so the denominator in our refractive index ratio is 1. For biological samples, the object medium will sit somewhere between water and oil, and so the numerator in our refractive index ratio is in the range 1.33-1.51. This means, depending on the biological sample we choose, our remote refocus microscope can produce high quality 3D images if we set the remote refocus magnification in the range 1.33-1.51.
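Those numbers fall straight out of the condition itself; a minimal sketch:

```python
def remote_refocus_magnification(n_object, n_image=1.0):
    """Remote refocus condition: magnification equals the refractive index
    ratio between object space and the remote image space."""
    return n_object / n_image

# 3D image formed in air, biological samples spanning water to oil-like RI:
m_lo = remote_refocus_magnification(1.33)   # 1.33x for a watery object
m_hi = remote_refocus_magnification(1.51)   # 1.51x for an oil-like object
```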
Now for the fundamental oversight. Until now, in coverslip corrected remote refocus microscopes, it was assumed that the object medium would match the immersion medium of the primary objective, and that the remote refocus condition should be set for this medium. For example, if the primary objective is a water immersion objective, then the remote refocus magnification would be set to 1.33, the refractive index of water (assuming a 3D image in air). This is a natural approach since we know that if the object and immersion medium are matched then ‘standard focus’ will work well. However, unlike standard focus, remote refocus does not actually require matched immersion to create a high quality 3D image. Specifically, the remote refocus condition makes no assumptions about the immersion medium, and applies strictly between object and image space.
This is a powerful concept. A remote refocus microscope can create a 3D image of an object regardless of the immersion being used by the microscope objective. For example, we can set up a microscope to use an air objective with a watery object. Standard focus will rapidly degrade the image due to the strong immersion mismatch, but we can configure the remote refocus optics to look deep into the water! Alternatively, we can use an oil objective with a watery object. Again, standard focus will degrade the image, but the remote refocus optics can still work, meaning that a remote refocus microscope can be used with any immersion objective.
In addition to a free choice of immersion objective, this concept also enables better 3D imaging in objects of different mediums. Because the remote refocus condition is not tied to the immersion of the objective, it can be tuned precisely to the object. Specifically, the magnification of the 3D image can be adjusted to match different objects with different refractive indices. In practice this can be done with a zoom lens over the whole biological refractive index range 1.33-1.51. So, for homogeneous objects, we can adapt to any object medium.
The ability to image in 3D with different immersion objectives offers new microscope modalities that are not possible with a standard setup. For example, we can use air objectives to look deep into liquids and utilize the air gap for thermal isolation, high-speed tiling and convenience. Alternatively, we can use oil objectives with any liquid and get improved angular collection of light, and less sensitivity to coverslip thickness compared to water or air objectives. In addition, we can now get high quality 3D imaging in objects with any refractive index, which bypasses the old traditions of standard focus and the need for immersion matching. These new methods now offer an advanced platform for many 3D imaging systems, and together we can think of them as enabling ‘any immersion microscopy’.