Synthetic oligonucleotides are increasingly central to modern research and therapeutic development. But their analysis via LC/MS often relies on large volumes of hazardous mobile phase additives like HFIP and TEA. HFIP, a PFAS-class compound, is particularly concerning due to its environmental persistence and negative impacts on human health. As sustainability becomes a growing priority across analytical labs, there’s a clear need for methods that reduce solvent and modifier use without compromising data quality.
In this webinar, scientists from Axcend and Agilent Technologies will present a collaborative study demonstrating how a standard oligonucleotide LC/MS method was successfully transferred to a compact, capillary-scale system. The result: a >99% reduction in HFIP, TEA, and methanol consumption, while delivering the expected chromatographic resolution and mass spectral accuracy.
In this webinar, you will learn more about:
Whether you're working in pharma, environmental testing, materials science, or any field where liquid chromatography plays a role, this session will offer practical guidance on greening your workflow without losing precision.
Samuel Foster, Ph.D.
Application Scientist
Axcend
Samuel Foster completed his Ph.D. in Pharmaceutical Chemistry from Rowan University in 2025. His research has focused on the development and application of capillary scale liquid chromatography instrumentation. He currently works at Axcend as an application scientist focusing on the development of chromatographic workflows for a variety of analyte classes including oligonucleotides, monoclonal antibodies, and drugs of abuse.
Lee Bertram
Technical Product Manager
Agilent
Lee Bertram currently serves as Technical Product Manager for LC Triple Quadrupole Mass Spectrometers at Agilent Technologies. With over a decade of experience in analytical chemistry, Lee has held roles in both research and product development across biotech startups and global life sciences companies. His background includes extensive work in method development, GMP compliance, and LC/MS applications spanning quadrupole, triple quadrupole, and Q-TOF technologies.
Sam Foster:
Hello everyone, and thank you for joining us today. I'm Dr. Sam Foster from Axcend. I'm an application scientist here, joined by Lee Bertram. This is a joint study we've done between Axcend and Agilent, where we're talking today about reducing PFAS use in oligonucleotide analysis.
So to start off, because we're going to be talking about it a lot, I want to set the stage by defining what capillary scale is. Traditionally, chromatographic scales are defined by the inner diameter of the column you're using, and that ties directly into the flow rates you operate it at. Your standard analytical scale comes in at 4.6 mm inner diameter.
Typically you operate these at about two milliliters a minute, sometimes a little slower, sometimes a little faster, but that's sort of your golden range. Now, over time, a number of different-sized columns have been introduced. Solvent-saver columns come in at three millimeters, and these were designed to basically cut solvent use in half. Then there's narrow bore, and these are what you often see as UHPLC columns.
Those are 2.1 mm inner diameter, typically run between 0.2 and 0.5 milliliters a minute. But down in the 0.1 to 0.3 millimeter range is capillary scale, and there we see a very significant reduction in mobile phase consumption, now operating between 1 and 10 microliters a minute. So when I mention capillary scale today, we're going to be thinking about 0.1 to 0.3 mm ID columns run at microliters per minute.
So really, this all started many years ago with the goal of making smaller, lighter, and more portable instruments. This is sort of a history of compact LC, and I don't necessarily say capillary scale because a lot of the earlier systems weren't. The first compact LC came out in 1983 at 42 kg, so while it was small, it wasn't necessarily portable.
And that had a lot to do with the larger pumping system and the need to accommodate higher solvent volumes; you're carrying around liters of mobile phase. So while it was small in form factor, it was very heavy and not necessarily the most practical. Really, what I want to highlight is that over the years these systems get smaller and smaller, and starting in the early 2000s, with the introduction of more widely available capillary columns, we see a very dramatic reduction in overall size and weight.
We see most of them now coming in below ten kilograms, with our modern-day Axcend coming in at eight kilograms total. And really, a lot of this is due to the switch to capillary: you need far less mobile phase, and you can use smaller columns, smaller pumping systems, and smaller tubing. Switching down to these smaller scales really lets you do some very significant downsizing on your instrument.
So, I mentioned it before, but this is the Axcend Focus LC. It's a fairly high-pressure, dual-syringe-pump-based system, rated at 10,000 psi, or about 690 bar. We're able to operate fully portably; there's a battery inside that runs for ten hours before it needs to charge. You can also just plug it into a wall and run it as a benchtop system.
It's a small form factor, as we'll get into in a little bit, but I want to draw your attention to the front of the system, with the three different vials you see there labeled A, B, and W. Those are your mobile phase vials: mobile phase A, mobile phase B, and then your overall solvent waste. Thinking about the flow rates we typically operate at, the Axcend is really only using a few microliters a minute.
And I'll say from experience that I tend to have to replace the mobile phase because it's gotten old rather than because I've run out of it; you know, you need to prevent algae growth and things like that. So the mobile phase tends to age out rather than run out, and it's a very useful system for small-volume consumption.
Additionally, because it's a very compact system, we have a much lower benchtop footprint, which is great in a lot of space-limited environments. Not only are we reducing your solvent use, but we're also reducing the actual size of the instrument. We have a number of cases where customers have placed them inside fume hoods, in the back of trucks, or just on carts: anywhere you can think of where you'd want an LC but might not be able to fit a standard one, you can start to look at integrating fully capable HPLC into these smaller form factors and into setups and locations where it would otherwise be rather impractical. So, just to reiterate and really hammer home how much solvent we're actually saving here: this was a customer study in which they ran the same method on a standard big-box HPLC and on the Axcend Focus LC, and then calculated the overall usage and cost per year.
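As a rough back-of-the-envelope sketch of how such an annual-usage figure is estimated (the flow rates and duty cycle below are illustrative assumptions, not the customer's actual parameters):

```python
def annual_mobile_phase_liters(flow_ul_per_min: float, hours_per_day: float = 8,
                               days_per_year: float = 250) -> float:
    """Rough annual mobile-phase consumption for an LC run on a given duty cycle."""
    minutes = hours_per_day * 60 * days_per_year
    return flow_ul_per_min * minutes / 1e6  # microliters -> liters

analytical = annual_mobile_phase_liters(1000)  # hypothetical 1 mL/min method
capillary = annual_mobile_phase_liters(2)      # hypothetical 2 uL/min method
print(f"analytical: {analytical:.0f} L/yr, capillary: {capillary:.2f} L/yr")
print(f"reduction: {analytical / capillary:.0f}x")
```

With these assumed values the analytical method consumes about 120 L/yr versus about 0.24 L/yr at capillary scale, the same order of magnitude as the customer figures quoted next.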
And what we see is that on the standard big-box HPLC, we generate about 100 liters per year, which comes to a total solvent purchase and waste disposal cost of about $28,000. So, pretty high per instrument. If we then take a look at that same method translated down to the capillary scale, we see that it now generates 0.2 liters.
So, not even a full liter, and it only costs about $27 a year. A very dramatic reduction, not only in your solvent generation, but in the cost it takes to actually run the system. Now, systems that can reduce solvent use are great, but we also want to make sure we're actually getting repeatable and usable data.
That way we can integrate it into existing workflows. For this we used an Agilent Eclipse Plus C18 column at the capillary scale, 0.3 millimeters in diameter, and we performed a simple separation of four known compounds. We did a single-day and a multi-day study, with 30 injections across a single day and five injections per day across three days.
Typically, you want to keep your relative standard deviation below 1%, and what we can see is that we hit that mark perfectly fine, with our single-day coming in at 0.183% and our multi-day coming in at 0.706%. So both single- and multi-day repeatability fall within acceptable thresholds, and we're able to perform that down at the capillary scale pretty easily.
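For reference, the %RSD metric quoted here is just the sample standard deviation over the mean. A minimal sketch with hypothetical retention times (not the study's actual data):

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation: sample stdev / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# hypothetical retention times (min) from five repeated injections
rts = [4.51, 4.52, 4.50, 4.51, 4.52]
print(f"{percent_rsd(rts):.3f}% RSD")  # well under the 1% threshold
```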
So if you're looking at this wondering how you can actually start to translate your methods to the capillary scale, I think USP Chapter <621> is a really valuable resource. Some of its guidelines are made specifically for USP monographs, so if you're operating in a non-regulated environment, you don't necessarily have to follow all of them to the letter.
And there are a number of papers detailing very similar things you could do that don't necessarily have to fall within the USP criteria. That being said, I think it's a great resource and I highly recommend taking a look at it, because it really does give you basically everything you need to generate comparable data between those scales.
It allows for a number of different changes. With your column dimensions, it wants you to keep within a specific length-to-particle-size ratio; this is intended to keep the efficiencies the same. So while you can adjust the inner diameter all you want, you really want to make sure your columns are otherwise very similar to each other.
Then it talks about how you can scale the flow rate so you can keep a comparable linear velocity, and therefore a comparable separation, between the different scales. So how do you take a couple of milliliters a minute and turn it down to microliters a minute? They have guidelines for that.
After you've scaled the flow rate, they talk about how you can change the gradient programming to account for differences in dwell volumes between systems, the impacts of pre- and post-column broadening, and differences in flow rate in general. So it gives you great criteria there on how to keep an effectively identical gradient. Injection volume is one that gets a little bit tricky as you go to capillary scale.
Because a lot of these columns can be overloaded very easily, you have to keep a very close eye on your actual injection volume. This is something that USP themselves recognize, and they leave quite a bit of flexibility as long as you meet system suitability criteria. So that's one you have to play with a little bit.
But they give you very, very wide room to play with. There are a number of additional changes that USP specifically allows. Beyond that, if you're looking to operate with a non-validated or non-regulated method, really, the sky's the limit: there are tons of different changes and modifications you can make, and it boils down to adapting your system to the needs of the separation.
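The geometric scaling rules described above can be sketched as follows. This is a simplified illustration of USP <621>-style scaling (the column dimensions and flow rate below are made-up examples, not the conditions used in this study):

```python
def scale_flow(f1: float, dc1: float, dc2: float, dp1: float, dp2: float) -> float:
    """Scale flow rate to keep linear velocity comparable:
    F2 = F1 * (dc2/dc1)^2 * (dp1/dp2), diameters in mm, particles in um."""
    return f1 * (dc2 / dc1) ** 2 * (dp1 / dp2)

def scale_injection(v1: float, l1: float, dc1: float, l2: float, dc2: float) -> float:
    """Scale injection volume with column volume: V2 = V1 * (L2*dc2^2)/(L1*dc1^2)."""
    return v1 * (l2 * dc2 ** 2) / (l1 * dc1 ** 2)

# example: 2.1 mm x 50 mm, 1.8 um column at 400 uL/min -> 0.3 mm x 50 mm capillary
f2 = scale_flow(400, dc1=2.1, dc2=0.3, dp1=1.8, dp2=1.8)  # uL/min
v2 = scale_injection(5, l1=50, dc1=2.1, l2=50, dc2=0.3)   # uL
print(f"capillary flow: {f2:.1f} uL/min, injection: {v2:.3f} uL")
```

For this example the 400 uL/min narrow-bore flow scales down to roughly 8 uL/min, and the 5 uL injection scales to about 0.1 uL, which is why injection volume needs the close attention mentioned above.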
All that being said, here is a directly translated USP monograph to show you that it works. We see an analytical-scale separation of hydrochlorothiazide and its two impurities shown there in the smaller box, and then we see it over on the capillary scale. Again, I want to mention there were some differences in absorbance, and that has a lot to do with the volume injected.
This was a directly scaled injection volume, and because we had a slightly different path length than an analytical-scale detector, we had slightly different sensitivities. It's easily rectified by simply using a longer path length flow cell, or by injecting more, which is still fully allowed within the USP. But this was a direct 1-to-1 translation to show that you do still have to pay some attention to those injection volumes as you change scales.
That being said, while we saw a little less sensitivity in the UV for a number of factors, where capillary really starts to shine is in mass spec, and specifically in electrospray ionization. In electrospray ionization, as you go to these lower flow rates, you get better ionization efficiency, because you're making smaller and smaller droplets of solvent that then evaporate quicker and better ionize your analytes of interest.
And so I think this is a really great example where you can see that at low flow we have far less baseline noise, and we're able to see a lot of those smaller compounds that otherwise would have been lost completely to the noise. Whereas if we look at some of these higher flow rates, you really start to lose everything but your main peaks.
And so, electrospray ionization at low flow is a really powerful technique for increasing your sensitivity. To follow up on that, this was a second joint study we had done with Agilent some time ago for peptide analysis. By comparing a standard-flow system running at half a milliliter a minute, or 500 microliters a minute, to a capillary-scale system at two microliters a minute, we're able to see a roughly tenfold increase in overall sensitivity by simply reducing the flow rate, keeping the same column and the same sample.
And really, we get dramatically increased overall sensitivity because of that ionization efficiency. So a lot of what we're going to be talking about today is leveraging those low flow rates and those reductions in solvent consumption, not only to reduce PFAS, but to produce better data.
Lee Bertram:
For the past ten years, Agilent has been innovating year after year with new products in the quadrupole market, whether it's single quads, small-form-factor triple quads, or large-form-factor triple quads: starting all the way back in 2015 with the 6125, which offered orthogonal detection and seamless integration into your HPLC, all the way up to our flagship triple quad, the 6495D, released in 2023 with best-in-class sensitivity and robustness.
Many of these instruments have won awards for making the lives of scientists easier and providing clearer results for your lab. The accumulation of these instruments and the past ten years of innovation at Agilent have led to the next-generation LC/MS single quadrupole systems, the Agilent Pro IQ and the Pro IQ Plus. The Pro IQ has been designed from the ground up to provide you with high performance and maintainability.
The system supports the Agilent Jet Stream ion source, providing the highest sensitivity and ensuring that you miss nothing when you analyze your samples. The VacShield allows maintenance of the system to be faster and easier than ever before, and we completely redesigned the quadrupole and electronics to allow for extremely fast polarity switching and stable high-mass ion transmission, making sure that you're able to analyze every compound that comes your way with ease and confidence.
As I mentioned earlier, this system was designed with you in mind. By providing superior sensitivity, as low as single-digit femtograms on column, we are confident that you will never miss a compound in your samples. With large molecules becoming ever-present in analytical labs, you must have the ability to analyze everything from sub-200-molecular-weight compounds all the way up to large biomolecules.
We tested high-mass ion transmission stability over a long period of time and across varying lab temperatures to ensure that the Pro IQ can meet the needs of the modern lab.
Hardware is only half of the story. For a scientist to be able to process data and accurately report results, you need compatible software tools. The Pro IQ is powered by OpenLab CDS, which offers many solutions to aid in the processing and reporting of data. Here, we're taking a look at results from the automated deconvolution feature, which allows you to get quick molecular weight confirmation of biomolecules.
This feature is sequence-enabled and allows for quick and easy confirmation of your samples, injection by injection, without the need for manual processing at the end of your run. With our latest quadrupole instrument, the Pro IQ, you're able to equip your lab with the latest in mass spec innovation, ensuring that you're able to tackle any samples that come your way, whether small molecules or large oligonucleotides.
Our two instrument options allow you to decide what is best suited for your lab needs.
Sam Foster:
So for this study, we're focusing primarily on oligonucleotides, and I think it's worth taking a second to discuss what they are and some of the difficulties in analyzing them. Oligonucleotides are short chains of nucleotides; therapeutics typically range up to about 25 units. They have a very wide range of applications, from therapeutics to genetic research.
They are really useful compounds that are worth taking a look at. That being said, from an analytical standpoint they tend to be very complex and come with a number of challenges. Not only do we have to worry about accurately assembling chains of up to 25, or sometimes hundreds, of units, as we'll show later on, but many of these compounds then have structural modifications made from their initial base state to enhance overall stability or efficacy.
And so it can become a very considerable challenge, not only to create them, but to analyze, separate, and properly determine their mass. On top of all that, they tend to be prescribed and used at low nanomolar concentrations, so when using something like UV detection, you can often run into limits of sensitivity.
So, due to their highly complex nature, their low abundance, and the need to validate often very large and complex chains, we tend to prefer LC/MS analysis for these more complicated structures. Now, on top of that, we also mentioned that we're doing this to reduce PFAS. PFAS, or per- and polyfluoroalkyl substances, represent a class of compounds known for their environmental persistence.
We find these just about everywhere, they stick around just about forever, and more and more we're finding out they may not be the greatest things for human health or environmental health. The problem is they're really great for chromatography, and that's an issue. TFA, or trifluoroacetic acid, is a very common mobile phase additive.
It can often improve your peak shape, although it does have some drawbacks when using LC/MS. It's a very widely used mobile phase additive, and it's something that does fall within the PFAS class. For oligonucleotide analysis specifically, we have something called hexafluoroisopropanol, or HFIP. This is again a PFAS compound, primarily used for ion pairing with oligonucleotides.
We can see in the figure there that the top trace doesn't use HFIP; it uses a different ion-pairing reagent, TEA, and we see lower abundance and poorer resolution. Whereas when we switch to HFIP, we see higher sensitivity, and we also start to see much better resolution of these components. So while we don't like the fact that it's a PFAS, we do like what it does for our overall analysis and sensitivity.
And so our proposition, and the goal of this work, was to say: maybe you don't have to remove it. By leveraging the low flow of the Axcend Focus LC with the high sensitivity of the Agilent mass spec, we're able to work together and, rather than changing our ion pairing, dramatically cut how much of it we consume, producing comparable and highly sensitive methods for oligonucleotide analysis.
Lee Bertram:
When analyzing oligonucleotides, it is important to resolve compounds similar in size and length. We wanted to ensure that the Axcend-Agilent offering was able to baseline-resolve a large ladder of oligos, as shown here. As Sam mentioned earlier, we are using a fluoroalcohol/alkylamine ion-pairing system with a simple gradient, and we're able to achieve the necessary separation of a wide range of oligo lengths, from 15-mer all the way up to 40-mer.
After ensuring the chromatographic portion of the method is working, we need to dive deeper into the spectral quality and ensure that the mass spec is giving us the data we need. Here you can see the spectra of the first three oligos from the previous slide. The charge state distribution is clear and easy to discern, allowing our automated deconvolution feature to easily provide you with the mass of the oligo.
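Conceptually, deconvoluting a charge-state series reduces to solving each peak for the neutral mass. A minimal sketch, assuming simple deprotonated [M − zH]z− ions in negative mode and a made-up 4500 Da oligo (real deconvolution software also infers the charge states themselves):

```python
PROTON = 1.007276  # Da

def neutral_mass_negative(mz: float, z: int) -> float:
    """Neutral mass from a deprotonated [M - zH]^z- ion observed at m/z."""
    return z * mz + z * PROTON

# simulate peaks for a hypothetical 4500 Da oligo at three charge states
true_mass = 4500.0
peaks = [((true_mass - z * PROTON) / z, z) for z in (4, 5, 6)]

# each peak independently recovers the same neutral mass; average them
estimates = [neutral_mass_negative(mz, z) for mz, z in peaks]
avg = sum(estimates) / len(estimates)
print(f"deconvoluted mass: {avg:.3f} Da")
```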
On the next slide, you can see the same thing repeated, but with oligomers of over twice the size being deconvoluted with less than 50 ppm accuracy, namely for both the 35- and 40-mer.
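The ppm figures quoted here relate absolute mass error to the analyte's mass. For example, a 0.5 Da error on a hypothetical ~12.3 kDa 40-mer (an illustrative mass, not the exact compound measured) sits comfortably inside a 50 ppm window:

```python
def ppm_error(observed: float, theoretical: float) -> float:
    """Mass accuracy in parts per million."""
    return (observed - theoretical) / theoretical * 1e6

# hypothetical: a 40-mer of theoretical mass 12300.0 Da deconvoluted 0.5 Da high
print(f"{ppm_error(12300.5, 12300.0):.1f} ppm")  # ~40.7 ppm, under 50 ppm
```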
Here we see similar results, but with a much larger 100-mer single guide RNA. Usually, the larger the oligo, the harder the deconvolution, especially with a unit-mass instrument such as the Pro IQ. The advances made in hardware and software on this system have allowed for easy deconvolution of this compound with amazing sub-10 ppm mass accuracy. Running standards is great, but what about something more therapeutically relevant?
As the market is not only focused on single-stranded oligos, we need to ensure we can easily separate and identify double-stranded RNAs. Here we analyzed a double-stranded siRNA drug product and were able to separate and identify both the sense and antisense strands with little to no changes to the methodology stated earlier. By successfully analyzing a wide range of oligonucleotide samples with the Axcend LC combined with the Agilent Pro IQ Plus, we are confident that this system will meet the needs of your lab.
Sam Foster:
So to conclude, I want to go through some key points from this webinar. To start, an analytical-scale oligonucleotide analysis containing HFIP and TEA was transferred down to capillary micro-flow chromatography using the Axcend Focus LC. This method, while keeping the PFAS at the same concentration, was able to reduce its consumption 220-fold, or over 99.5%, compared to the previously mentioned analytical-scale method.
The LC/MS analysis of the DNA ladder standard, the 15- to 40-mer, as well as our 103-mer and real-world samples, was also demonstrated. The Agilent Pro IQ Plus ensures accurate deconvolution via its high-mass transmission efficiency and spectral quality. PFAS reductions through switching to the capillary scale allow for sustainable analysis without sacrificing your chromatographic efficiencies.
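Because the modifier concentrations were left unchanged, the PFAS reduction tracks the volumetric flow ratio directly. The flow rates below are illustrative values chosen to reproduce the quoted 220-fold figure, not necessarily the exact method conditions:

```python
def fold_reduction(analytical_flow_ul: float, capillary_flow_ul: float) -> float:
    """Per-minute solvent (and additive) reduction when modifier
    concentrations are unchanged: it scales directly with flow rate."""
    return analytical_flow_ul / capillary_flow_ul

# hypothetical: 440 uL/min analytical method translated to 2 uL/min capillary
fr = fold_reduction(440, 2)
print(f"{fr:.0f}x lower HFIP/TEA use ({(1 - 1 / fr) * 100:.2f}% reduction)")
```

A 220-fold flow reduction corresponds to roughly a 99.5% cut in additive consumption, matching the ">99.5%" stated above.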
And through spectral deconvolution, we showed molecular weight assignments within 0.5 Daltons of the theoretical values.
With that, I'd like to thank everyone for attending, and we'll open the floor for questions. In this case, I will step in as the moderator and convey the questions either to myself or to Lee.
All right, the first question: how does the reproducibility of the Axcend Focus LC at capillary scale compare to traditional analytical-scale systems over extended runs? Yeah, it's very comparable repeatability. As I mentioned previously, the goal is to stay below about 1% relative standard deviation, which we met. That 1% threshold varies application to application.
But it really is comparable repeatability, as far as we've seen.
Lee, this might be a good one for you: are there limitations or special considerations when interfacing the Axcend Focus LC with the Agilent Pro IQ Plus for real-time oligo analysis?
Lee Bertram:
No, there are no limitations or special considerations. The Axcend is supported in OpenLab and seamlessly integrated with the Agilent Pro IQ Plus.
Sam Foster:
Here's a good one for me: were any adjustments needed to the mobile phase composition or ion-pairing reagents to accommodate the reduced flow rates when scaling to capillary LC? No, and that's sort of the point: you can take your previously used analytical conditions and, through translating down to the capillary scale, keep all of the mobile phase compositions completely the same.
No need to change ion-pairing reagents, and you're still able to perform the analysis, but at dramatically reduced PFAS amounts.
Lee, this is to you. Can you elaborate on the 0.5 Dalton spectral deconvolution delta? How was that determined?
Lee Bertram:
Yeah. So, to calculate the expected mass, we use the average mass of the elements in the compound, and then we compare that to the deconvoluted molecular weight that comes out of our OpenLab CDS feature. That's how we got that 0.5 Dalton difference between what was expected and what was experimentally determined.
Sam Foster:
Lee, I'll send this one to you again. Is the 103-mer analysis you presented representative of what users can expect for other complex, highly modified therapeutic oligos, or are there additional challenges with those?
Lee Bertram:
Yeah. As stated in the presentation, the more complex and the larger the oligo, the harder the deconvolution and the more difficult the spectra appear to be. With the 103-mer, we're just trying to show what's on the high end of the capabilities of a single quad. But as shown by the siRNA example, with a highly modified, GalNAc-conjugated oligo, we're still able to deconvolute easily and accurately.
Sam Foster:
I'll take this one. The question is: how adaptable is this method to different column chemistries or mass spec platforms, especially for labs using non-Agilent instrumentation? Yeah, it is a very adaptable method. The Axcend is designed to be plug-and-play: you can set it up with any low-flow-capable LC/MS source and basically run away.
There may be challenges integrating with other software, in terms of different CDS drivers, but those are actively being worked on by Axcend.
Lee, I'll send this to you: can this capillary LC-MS method support quantitative workflows, such as pharmacokinetic or stability studies, for oligonucleotide drugs?
Lee Bertram:
Yeah, as shown by Sam, the retention time stability of the Axcend is well within reason, and the mass spec has extremely good high-mass ion transmission stability. These two go hand in hand for quantitative workflows, without any issues.
Sam Foster:
All right, I'll take this one. In high-throughput environments, what are the trade-offs in terms of runtime or throughput when adopting capillary LC for oligonucleotide analysis? Specifically with the Axcend, because it's a syringe-based system, in high-throughput environments you have to stop flow and refill those syringes whenever they run dry. And so, because of that, you'll actually lose a bit of throughput when you're trying to run over and over and over.
That being said, if the goal is considerable sample throughput, by swapping to the capillary scale, faster linear velocities can be achieved with much lower volumetric flow rates. So you can run faster with less solvent, which may give you an increase in overall throughput.
All right. I think that's just about all the time we have. If we didn't get to your question, please email it to me or Lee. I'd be more than happy to talk to you. And I'm sure he would say the same. Thank you all again for attending, and I hope it was very interesting.