The Issues with Analysing Water
Article Mar 03, 2017 | Louise Saul, Science Writer for Technology Networks
Contamination of water with pharmaceuticals has become a growing worldwide problem. With new drugs constantly being added to the market, there is a constant flow of contaminants entering the environment. Chemicals such as perfluorooctanesulfonic acid (PFOS) have a long half-life and have proved so toxic to the environment that companies were forced to consider alternatives. Limited detection methods also mean that not all problem compounds are detected, causing further issues.
We spoke to Neil Donovan from i2 Analytical about the problems scientists face when analysing these samples, how the technology and instrumentation have changed over the years, and how this has affected the industry.
LS: What do you think the main problem is that scientists are facing when analysing contaminated water samples?
Neil Donovan (ND): There are multiple issues, which can be split into environmental and analytical chemistry problems. A lot of biologically active compounds are constantly being discharged into the environment. Some, such as ibuprofen, are not too bad because of their short half-life, but compounds like perfluorooctanesulfonic acid (PFOS) are a big problem. There has been a big push to stop the manufacture of compounds like PFOS so that no more enters the environment. With pharmaceuticals, the issue is the constant discharge resulting from the volumes companies use, and that is something regulation is yet to address.
Analytically, there is such a diverse range of licensed compounds, and they all have very different chemistries. As an analytical chemist you have to cover such a wide range of molecular weights, polarities, solubilities and other properties that the analysis becomes very complex. The Chemicals Investigation Programme (CIP) helps with this. CIP is a nationwide, two-phase study that is the first stage of building a picture of what is happening in UK waterways.
LS: You were involved with the development of novel techniques following the Buncefield Oil Disaster in 2005. Could you explain these techniques, and where things have developed from there?
ND: The Buncefield oil disaster happened in December 2005, when the plant went up in flames. Assistance was needed to determine the spread of PFOS contamination. Most of the samples on site were contaminated water samples, or pools of run-off from where the firefighters were extinguishing the flames. The problem at the time was that PFOS was not amenable to standard liquid chromatography techniques (e.g. LC-UV) or gas chromatography-mass spectrometry (GC-MS). A new method was required, and that method was LC-MS.
One method developed was a direct injection method, and another was a solid phase extraction (SPE) method that cleans up the sample prior to analysis. There was a big issue with ion suppression, which we eventually solved, and we had the method up and running and accredited to UKAS ISO 17025 within three months.
We found that it was not just PFOS that was the issue, but multiple related compounds. With liquid chromatography single quadrupole mass spectrometry it was very difficult to identify the other compounds, so a larger screening method was needed. That larger screening method is still used now. It led to the move from single quadrupole to triple quadrupole instruments, and there are now roughly 40 PFOS-related compounds that we screen for. We now have a much more accurate picture of contamination by these fluorinated compounds.
The single quad is still a great tool for initial method development, as you can iron out issues such as the chromatography and ionisation techniques before transferring to a triple quadrupole (triple quad). The triple quad gives you that extra level of sensitivity, but the single quad remains a great method development tool.
LS: What future developments can you see happening in your lab, and how do they help solve the current problems analysts are facing?
ND: The equipment we use is Agilent equipment. Between instrument models, sensitivity has increased roughly 30-fold for certain compounds. The sensitivity and limits of detection (LOD) needed for this analysis were simply not available a few years ago. During CIP phase one, companies were struggling to reach the required LODs, but with the new instrumentation we can now detect much lower levels, and do so confidently on a day-to-day basis.
Future developments will focus on accurate-mass, time-of-flight (TOF) instrumentation and the associated software that will allow us to undertake unknown screening. Everything we currently look for is on a targeted list with achievable LODs. The percentage of what we can analyse compared with what is out there is minuscule. Analysis needs to move from targeted analysis to screening analysis to gain a better-defined understanding of what samples contain.
There is an understanding now that targeted analysis is good for legislation, but it does miss an awful lot. QTOF technology is very expensive, but as the technology improves more and more people are using it, and the price should fall. I think screening analysis will become more commonplace as a result.
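As an illustrative aside (this is a sketch, not i2 Analytical's actual workflow), the targeted-list approach Neil describes can be thought of as an accurate-mass lookup: each measured m/z from an acquisition is compared against a list of known compound masses within a parts-per-million tolerance. The PFOS [M-H]- monoisotopic mass of 498.9302 Da is well established; the PFOA entry and the 5 ppm tolerance are assumed figures for illustration, and a real method would also use retention time and MS/MS confirmation.

```python
# Sketch of targeted (suspect) screening against an accurate-mass target list.
# Illustrative only; real workflows add retention time and MS/MS confirmation.

# [M-H]- monoisotopic masses in Da (PFOS value well established; PFOA assumed here)
TARGETS = {
    "PFOS": 498.9302,  # perfluorooctanesulfonate anion
    "PFOA": 412.9664,  # perfluorooctanoate anion
}

def ppm_error(measured: float, theoretical: float) -> float:
    """Mass error in parts per million."""
    return (measured - theoretical) / theoretical * 1e6

def screen(peaks, targets=TARGETS, tol_ppm=5.0):
    """Return (name, measured_mz, ppm_error) for each peak within tolerance."""
    hits = []
    for mz in peaks:
        for name, theo in targets.items():
            err = ppm_error(mz, theo)
            if abs(err) <= tol_ppm:
                hits.append((name, mz, round(err, 2)))
    return hits

# A 0.2 mDa error at m/z ~499 is well under 5 ppm, so the first peak matches PFOS;
# the second peak matches nothing on the target list and is silently ignored,
# which is exactly the blind spot unknown (QTOF) screening is meant to close.
print(screen([498.9300, 350.0000]))
```

The point of the sketch is the limitation Neil highlights: anything not on `TARGETS` is invisible, however abundant, which is why moving from targeted lists to full-spectrum screening matters.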
Neil Donovan, Environmental Forensics Manager at i2 Analytical, was speaking to Louise Saul, Science Writer for Technology Networks