Film vs. Digital: A Real Western (Blotting) Showdown

by Jason Garner on the 27th of April, 2012 in Protein Analysis, Detection & Assay
About Jason Garner
I have what some might call an eclectic background. Originally a music major (yes, I wanted to be a rock star), I spent the first 17 years of my professional career as a technical advisor in the car business for Acura, BMW, Mazda and Volkswagen. Twelve years ago, my mother’s bout with cancer brought out the scientist in me. I’ve been lucky enough to earn two degrees in cell and molecular biology and have a passion for virology and immunology. I’ve also been honored to serve as senior molecular biologist for the US Department of Defense’s Global Influenza Surveillance Program from 2006 to 2011, where we made great strides in flu surveillance and vaccine development. Our model was so successful that I started my own infectious disease and molecular diagnostics consulting company, Movira Sciences & Diagnostics (still trying to get it off the ground ;) ). I count many global scientists and biotech professionals as friends and colleagues with whom I’ve co-authored many journal articles, presentations and abstracts. I currently work as an Applied Testing Applications Scientist with QIAGEN, Inc. and serve as an infectious disease moderator for LabRoots.com. I play guitar and yes, I still want to be a rock star.

You’ve masterfully run and transferred your gel, and now it’s time to probe and quantify your protein(s). You’ve got your antibodies and ECL ready to go. Substrate – check. Film cartridge – check. Darkroom – ah yes, that magical place where night vision goggles are required to navigate a veritable minefield of potential chaos. It didn’t seem so imposing when the lights were on, and besides, you’ve brought a friend to help. But now the lights are out, you can’t find your film cartridge, and, while awkwardly grasping in the darkness for your blot, you’ve managed to grope your startled colleague. Once the shrieking subsides, you may be able to locate the developer and develop your blot…but your colleague will never go back in the darkroom with you again. Isn’t there a better way?

Since the late 1970s, chemiluminescence has proven to be a reliable and highly sensitive method for detecting and quantifying proteins. It’s also a much safer alternative to radioactive probes, which require you to wear enough Nomex to dance at Chernobyl. While there are a number of factors affecting the performance of chemiluminescent imaging (e.g. quality of materials, proficiency of the user, strength of the protocol, etc.), the wide dynamic range and rapid results have made this the method of choice for most labs over the past 30 years. With the advent of charge-coupled device (CCD) camera technology, new options for detecting, analyzing and capturing images are now readily available.

So, with western blotting so prevalent in so many different labs, which is the ‘better’ option – traditional film development, or CCD imaging? That is a difficult question to answer – needs, practicality, expertise, funding and applications vary from lab to lab. Let’s look at the inherent strengths and weaknesses of each method. As a disclaimer, this information was compiled over years of experience and, to a lesser extent, represents this author’s humble opinion.

Sensitivity / Dynamic Range / Limits of Detection

Using film to develop blots has traditionally been associated with a “visual assessment” of relative band intensities. In other words, you hold the film up to the light, cock your head like a dog at the sound of the word “treat”, and strain your eyes as if trying to decipher the fine print on the back of a credit card application. “Eyeballing it” may be fine if you’re looking for the presence or absence of a signal, or if the signal intensities you are comparing differ significantly (unless, of course, you forget to remove your sunglasses). However, Captain Obvious would point out that this type of analysis has severe limitations and is not useful for precise signal quantification. You can quantify signals on film by scanning the film image and measuring the density of the bands, but accurate measurement of band density ultimately depends on the sensitivity of the film, its linear response range, and the exposure time. In most cases, signal saturation can be reached quite rapidly, resulting in dark, diffuse bands on film. Once saturation has been reached, changes in signal intensity can no longer be measured. Translation? You’re out of luck, my friend.
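To make the scan-and-measure step concrete, here is a minimal densitometry sketch in Python. The `band_density` helper and its background-subtraction scheme are illustrative assumptions on my part, not the method of any particular analysis package:

```python
import numpy as np

def band_density(img, rows, cols):
    """Integrated band density: summed pixel intensity inside the band's
    region of interest (ROI), minus a local background estimate taken
    from the rows just above and below the band."""
    roi = img[rows, cols]
    # crude background: median of a 2-pixel border above and below the band
    border = np.concatenate([img[rows.start - 2:rows.start, cols],
                             img[rows.stop:rows.stop + 2, cols]])
    background = np.median(border)
    return float(roi.sum() - background * roi.size)

# Toy "scanned film": a 20x20 image with one band of intensity 100
scan = np.zeros((20, 20))
scan[8:12, 5:15] = 100.0
print(band_density(scan, slice(8, 12), slice(5, 15)))  # 4000.0
```

Real gel-analysis software does considerably more (lane finding, rolling-ball backgrounds), but the principle is the same: integrate pixel intensity over the band and subtract local background.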

A CCD-captured image will typically be able to detect chemiluminescent signals distributed over a much broader range. Moderate signals, which would have reached saturation (i.e. appear completely black) on film, may still fall in the middle (gray) portion of the digital imager’s dynamic range. A digital imager typically records from 4,096 to 65,536 different levels of intensity, thus covering a greater dynamic range of gray scale. When prepared for publication, blot images are generally displayed over 256 levels of gray. Thus, the dynamic range of digital imagers is on average 2–4 orders of magnitude higher than that of film.

Saturation of signal on blot images obtained with film can give the perception that film is more sensitive than digital imaging. However, the Achilles’ heel of film is its higher LOD (limit of detection) compared to that of digital imaging. Weak signals need extensive exposure times to be detected and quantified on film. Conversely, digital imaging systems, because of limitations on integration time, are not as sensitive as film exposed for extended periods of time. Bottom line? Film will take less time to develop high-intensity signals, but digital imagers, because of their lower LOD, will capture faint signals that film misses, and do so without driving stronger signals into saturation.
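The saturation trade-off above can be sketched numerically. In this toy model the ceilings are illustrative assumptions – film clipping at roughly 1,000 arbitrary units, a 16-bit imager at 65,535 – but they show why saturated bands can no longer be compared:

```python
import numpy as np

# Hypothetical true band signals spanning four orders of magnitude
true_signal = np.array([5.0, 50.0, 500.0, 5_000.0, 50_000.0])

# Toy saturation ceilings: film clips early, a 16-bit imager much later
film_reading = np.clip(true_signal, 0, 1_000)
imager_reading = np.clip(true_signal, 0, 65_535)

print(film_reading)    # the two strongest bands both read 1000 -- indistinguishable
print(imager_reading)  # all five bands remain distinct and quantifiable
```

A tenfold difference between the two strongest bands vanishes entirely on the clipped “film” readout, which is exactly the out-of-luck scenario described above.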

Cost

No matter which way you slice it, developing western blots is expensive. For 5 labs developing approximately 10 blots/week, with around 4 exposures per blot, the costs associated with maintaining a darkroom, running a developer, purchasing the developer chemicals and, of course, film, can run about $16,000/year. Imagers with chemiluminescence capability start around $13,000 and top out around $40,000. The vast price range is due to the different technologies/options that can be included, the quality of the optics, software, etc. Even given the higher prices on some units, digital technology wins here: benefits include versatility, ease of use and all-in-one packages, and these imagers can pay for themselves in as little as a year.
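Using the figures quoted above, the payback period works out roughly as follows (a back-of-envelope sketch – your lab’s consumables bill will vary):

```python
# Annual film-based costs quoted above: darkroom upkeep, developer,
# chemicals and film for 5 labs at ~10 blots/week, ~4 exposures/blot
film_cost_per_year = 16_000

# Quoted price range for chemiluminescence-capable imagers
imager_price_low, imager_price_high = 13_000, 40_000

payback_low = imager_price_low / film_cost_per_year    # ~0.8 years
payback_high = imager_price_high / film_cost_per_year  # ~2.5 years
print(f"Payback period: {payback_low:.1f} to {payback_high:.1f} years")
```

So an entry-level imager recoups its price in under a year of avoided film and darkroom costs, and even the top-end units break even in a few years.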

Bottom Line

As scientists, we are, by definition, a fussy, routine-driven and skeptical lot. Many of us have a hard time letting go of old technologies and embracing new ones. Many will say, “Film just works, it’s what I’ve always used,” and they’d be right. When used properly, film just flat out works. But many are embracing the idea that western blot data, analyses and images can be captured and analyzed more effectively. Digital imaging systems provide a convenient and practical solution for imaging and performing quantitative analysis of chemiluminescent blots, all in one package. Most systems are capable of providing better linearity between the amount of protein and the signal intensity over a broader dynamic range, with higher sensitivity than film over shorter exposure times. In addition, imaging systems are less expensive over time. Compared with X-ray film, successful imaging of chemiluminescent blots using a digital imaging system is fast, accurate, and reproducible.

What about you – do you prefer to use film or an imager to develop your blots?

