Although I expected a slightly different meeting format (a selection of a few topics discussed in detail with the attendees), I was able to pick up some new ideas and directions in current and future analytical chemistry.
The speakers (Robert Stevenson, Tom Jupille, and David Sparkman) presented their views on new directions in analytical chemistry, liquid chromatography, and mass spectrometry.
Robert Stevenson – New technologies in analytical chemistry
Robert Stevenson started with the importance of semantic technologies and of data processing and control. With emerging techniques, more and more data are acquired and analyzed. Compared to a “regular” database with x and y coordinates, semantic analysis allows you to add flexible, dynamic dimensions using the RDF framework.
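To make the contrast concrete, here is a minimal sketch of the idea in plain Python (the sample names and predicates are hypothetical, and a real implementation would use a proper RDF library rather than tuples):

```python
# A "regular" table row is fixed to its columns, while an RDF-style triple
# store lets you bolt on new dimensions at any time without a schema change.

# Relational view: every record has the same fixed (x, y) schema.
table = [{"sample": "S1", "x": 2.31, "y": 0.87}]

# Semantic view: subject-predicate-object triples; adding a new "dimension"
# is just adding another triple.
triples = [
    ("S1", "hasRetentionTime", 2.31),
    ("S1", "hasSignal", 0.87),
    ("S1", "analyzedBy", "HPLC-UV"),  # extra dimension, added dynamically
]

# Query: everything known about sample S1.
facts = {p: o for s, p, o in triples if s == "S1"}
print(facts)
```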
Next, he focused on high content analysis, a relatively young technique (6–7 years old) connected mainly with fluorescence microscopy and subsequent data analysis. Using this technique, human stem cells can be tested, so testing on animals can be avoided. The drawbacks are the large data output and the higher price ($0.25 per well, and a typical experiment analyzes 10^5 to 10^7 wells).
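To put the price per well into perspective, a quick back-of-the-envelope calculation using the figures quoted in the talk:

```python
# Cost range of a high content analysis experiment:
# $0.25 per well, 10^5 to 10^7 wells per typical experiment.
price_per_well = 0.25  # USD

low_wells, high_wells = 10**5, 10**7
cost_low = price_per_well * low_wells    # $25,000
cost_high = price_per_well * high_wells  # $2,500,000
print(f"${cost_low:,.0f} to ${cost_high:,.0f}")
```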
Software-enabled microscopy was the last part of Robert Stevenson’s talk. With this approach, optical imperfections are corrected in software. For example, fluorescence photo-activated localization microscopy enables imaging of DNA strands during cell division. Unbelievable.
Tom Jupille – High performance liquid chromatography
Tom Jupille summarized the history of liquid chromatography, which began in 1905 with Michael Tswett. There was no giant step in instrumental development during the first fifty years. Columns in 1955 looked almost the same as those used by M. Tswett (although they were much bigger) and still used gravity to drive the mobile phase.
In contrast, the next 50 years changed the instrumentation completely. New techniques such as HPLC were introduced thanks to the rapid development of high-quality engineering and computer technology. By 2005, instruments were self-standing, robust machines controlled by a PC, producing hundreds or thousands of results every day.
And 2055? Miniaturization and black boxes. Tom Jupille compared the evolution of chromatographic systems to that of personal computers. Nowadays, only a small group of people knows how computers work. The majority of people use computers in their daily life but have no clue about bits, RAM, or the binary system.
The same applies to chromatographic instruments. In the future, people will use an LC system and may not even notice they are using one. For them, it will simply be the way to get a result: the composition of a sample, the level of a compound of interest, sample preparation for further analysis, and so on.
My heretical question: is liquid chromatography a sample preparation technique for mass spectrometry?
The future of HPLC instrumentation began with the introduction of the Dionex ICS-5000, a system with minimal extracolumn volumes and a flexible, modular design. Here I agree with Tom Jupille and his statement “As instruments evolve, they become appliances.” We, the method development crowd, should provide the end customer with the required technology/instrument/column. And the end customer does not need to know the efficiency at the minimum of the van Deemter curve.
The general trend in column development is the reduction of particle size: from tens of micrometers at the beginning of HPLC history to today’s highly efficient sub-2 μm particles. The advent of ultra-high-pressure liquid chromatography (UHPLC) is closely related to the pressure issue of HPLC columns packed with small particles (high pressure is the price paid for speed and efficiency).
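The efficiency/pressure trade-off can be sketched with the van Deemter equation, H(u) = A + B/u + C·u. The coefficient values below are made up for illustration (they are not from the talk); the 1/dp² pressure scaling is the standard rule of thumb for packed columns:

```python
import math

def plate_height(u, A, B, C):
    """van Deemter plate height H at linear velocity u."""
    return A + B / u + C * u

A, B, C = 1.0, 2.0, 0.05  # hypothetical coefficients

# Setting dH/du = 0 gives the optimum:
#   u_opt = sqrt(B/C), H_min = A + 2*sqrt(B*C)
u_opt = math.sqrt(B / C)
H_min = A + 2 * math.sqrt(B * C)

# The price for small particles: backpressure scales roughly as 1/dp^2,
# so halving the particle size (5 um -> 2.5 um) quadruples the pressure drop.
pressure_ratio = (5.0 / 2.5) ** 2
print(u_opt, H_min, pressure_ratio)
```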
The question is: is there always a need for very highly efficient columns? No. We need resolution and selectivity (as Peter Schoenmakers mentioned during HPLC 2010 and Tom Jupille during his talk).
Another issue with HPLC columns packed with very small particles is the contribution of extracolumn volumes to the separation (I already mentioned it above). By reducing extracolumn volume you can easily improve the separation power of your HPLC system at very low or no additional cost. Still, sub-2 μm superficially porous particles are probably the packing material of the future.
Monoliths, on the other hand, offer the efficiency of 3 μm particles with the back pressure of 10–15 μm particles. According to Tom Jupille, they are not going to dominate over other separation materials, mainly because of patent issues.
From my point of view, there is room for monoliths in special applications using stationary phases tailored to a particular separation problem with the desired selectivity (as well as efficiency and back pressure). Another advantage of monoliths is their easy preparation in special separation device formats (lab-on-a-chip).
I believe that the second generation of organic polymer monoliths (hypercrosslinked materials) may contribute significantly as a new family of stationary phases. I call them superficially porous monoliths.
David Sparkman – Mass spectrometry as a separation technique
I had an opportunity to share a table with the last speaker, David Sparkman. He likes the Czech Republic, which he visited during an international conference a few years ago.
David Sparkman said that mass spectrometry can be considered a separation tool since the very beginning of the technique. The main improvement in MS detection is attributed to the development of MS/MS detection in 1978. Surprisingly, in the 1990s the GC-MS technique almost disappeared.
Nowadays, the main driving forces in the development of mass spectrometry instrumentation are tandem quadrupole arrangements and ion mobility mass spectrometry. In the latter case, ions are separated not only by the mass spectrometer but also according to their different mobilities, based on size, charge, and so on.
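The mobility idea can be illustrated with a deliberately simplified toy model: drift time in a drift tube grows roughly with the collision cross section (CCS) and drops with charge (a crude approximation of the Mason–Schamp relation). All numbers below are hypothetical:

```python
def relative_drift_time(ccs, charge):
    """Toy model: drift time ~ proportional to CCS, inverse to charge."""
    return ccs / charge

# Two hypothetical isobaric ions (same m/z) with different shapes:
compact = relative_drift_time(ccs=150.0, charge=1)   # folded conformer
extended = relative_drift_time(ccs=210.0, charge=1)  # elongated conformer

# MS alone cannot tell them apart (same m/z); ion mobility can,
# because the extended ion collides more and arrives later.
print(extended > compact)
```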
David Sparkman then described recent developments in mass spectrometry instrumentation. He started with the Waters Xevo TQ, with its quadrupole – collision cell – quadrupole arrangement, which improves the signal-to-noise ratio and allows analysis of femtograms of sample.
AB Sciex developed the TripleTOF, which is NOT an instrument with three time-of-flight analyzers, but it performs as if the analysis were done on three TOF systems. The TripleTOF uses a 40 GHz time-to-digital converter, which allows very high sensitivity and detection speed.
The QIT (from Thermo, I believe) is a quadrupole – ion trap mass spectrometer with a dual-cell arrangement (low and high pressure) that shows very high resolution.
For lack of time, David Sparkman only briefly mentioned other manufacturers, such as Bruker with its maXis, Shimadzu with the LCMS-8030 system, and Agilent with the 6490 using jet-stream technology.
From a chromatographic point of view, the most important characteristics for mass spectrometry are the robustness, repeatability, and reproducibility of the chromatographic separation feeding the subsequent mass spectrometric analysis.
“If you do LC and don’t know MS, you may be out in the cold.” – David Sparkman
During the discussion I asked about sample preparation. As analysis times decrease (to minutes or even seconds), the importance of fast and robust sample preparation increases. If your analysis takes 5 minutes, you don’t want to spend an hour preparing your sample. According to the panelists, this topic needs special attention; however, because of time they did not elaborate on it further.
I am glad to be here in the Bay Area and to attend such meetings. I believe that in the future the CASSS discussion groups can be webcast on the internet (live or for download). Until then, you need to trust me ;-)
I would be more than happy to discuss your opinions about the future of analytical chemistry (and separation techniques in particular). Please feel free to comment on this article. If you prefer more open communication, you can use the chromatographer.com Facebook page.