Ultra Pure vs Feed Water, Comparing the 4 Types of Laboratory Water
There are four levels of water purity recognised in the water purification industry, each of which is used for specific applications in laboratories.
The quality of water is defined through a series of measurements: conductivity (µS/cm) or resistivity (MΩ-cm), Total Organic Carbon (TOC) in parts per billion (ppb), and bacterial count (CFU/ml).
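Conductivity and resistivity are reciprocals of one another, which is why the two units are interchangeable measures of the same property. As a quick sketch (function names are illustrative, not from any standard library):

```python
def conductivity_to_resistivity(conductivity_us_cm: float) -> float:
    """Convert conductivity (µS/cm) to resistivity (MΩ-cm): the reciprocal."""
    return 1.0 / conductivity_us_cm

def resistivity_to_conductivity(resistivity_mohm_cm: float) -> float:
    """Convert resistivity (MΩ-cm) to conductivity (µS/cm): the reciprocal."""
    return 1.0 / resistivity_mohm_cm

# Ultrapure water at 18.2 MΩ-cm corresponds to roughly 0.055 µS/cm.
print(round(resistivity_to_conductivity(18.2), 3))  # 0.055
```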
Here we explain the four types of water used in laboratories, as well as the processes and properties of each type.
Types of Laboratory Water
1. Feed Water
Feed water is also referred to as raw or potable water, and its quality depends on its source. Whilst water drawn from deep underground is naturally filtered by layers of rock and soil, water from surface sources like lakes and reservoirs is at risk of environmental contamination.
Raw or feed water is typically identified by measuring its colour, odour and turbidity. You can also look for chemical characteristics like pH and hardness, as well as bacteriological characteristics.
Some of the most significant contaminants of feed water include dissolved ions, minerals, microorganisms and organic compounds.
Feed water should always be tested, and appropriate pre-treatment measures applied, to ensure that its quality is sufficient not to damage the downstream purification technology.
The most common type of pre-treatment is depth filtration. This process consists of raw water passing through a series of winding fibres, which attract and trap impurities. Activated carbon is also used to remove free chlorine, which, if not removed, causes rapid deterioration of reverse osmosis (RO) membranes.
Another common step is the installation of a water softener to reduce the hardness of the water before it reaches the RO membrane; hard water causes scaling of the membrane and shortens its lifespan.
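The pre-treatment decisions above can be sketched as a simple rule set. The thresholds below are hypothetical, chosen for illustration only; real limits come from the RO membrane manufacturer's feed water specifications.

```python
# Hypothetical limits for illustration -- check your membrane's datasheet.
MAX_FREE_CHLORINE_PPM = 0.1    # above this, add activated carbon
MAX_HARDNESS_PPM_CACO3 = 120   # above this, add a water softener

def pretreatment_steps(free_chlorine_ppm: float, hardness_ppm: float) -> list:
    """Suggest pre-treatment stages to protect a downstream RO membrane."""
    steps = ["depth filtration"]  # always trap particulates first
    if free_chlorine_ppm > MAX_FREE_CHLORINE_PPM:
        steps.append("activated carbon")
    if hardness_ppm > MAX_HARDNESS_PPM_CACO3:
        steps.append("water softener")
    return steps

print(pretreatment_steps(0.5, 200))
# ['depth filtration', 'activated carbon', 'water softener']
```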
2. Primary Grade Water (Type 3)
Primary grade pure water (Type 3) uses carbon filtration and RO technology, and is the most cost-effective way to reduce water contaminants.
Removing up to 99% of feed water contaminants, RO reverses the natural osmotic process, in which water flows through a semipermeable membrane from a less concentrated solution to a more concentrated one. By applying external pressure to the more concentrated side, the osmotic flow is reversed, forcing water through the membrane and leaving the impurities behind on its surface.
RO relies on diffusion rather than simple mechanical separation, rejecting particles of higher molecular weight. Feed water temperature, pressure, and the physical condition of the RO membrane all affect rejection rates. Rejection rates are therefore variable, although they tend to increase with the ionic charge and size of the molecule. For this reason, RO water can't be assigned a fixed purity classification.
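A membrane's rejection rate is conventionally expressed as the fraction of a contaminant that does not pass from the feed into the permeate:

```python
def rejection_rate(feed_ppm: float, permeate_ppm: float) -> float:
    """Percentage of a contaminant rejected by an RO membrane:
    (1 - permeate concentration / feed concentration) * 100."""
    return (1.0 - permeate_ppm / feed_ppm) * 100.0

# e.g. feed water at 400 ppm total dissolved solids, permeate at 8 ppm:
print(rejection_rate(400, 8))  # 98.0
```

Because temperature, pressure and membrane condition change the permeate concentration, this figure is measured in operation rather than fixed by the membrane alone.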
RO water is most commonly used as the starting point for many laboratory applications, including feeding glassware washing machines and autoclaves. It can also be used as a pre-treatment for ultrapure water systems, or for anything considered non-critical.
3. General Laboratory Grade Water (Type 2)
Type 2 water is produced through a combination of reverse osmosis and an additional technology such as ion exchange or electrodeionisation (EDI).
Deionisation, or ion exchange, removes ions from RO water using synthetic resins. Chemical reactions occur as the water passes through the ion exchange beads, removing ions from solution. This process continues until all unwanted ions are replaced by hydrogen and hydroxyl ions, which combine to form pure water.
EDI is an active purification technology that combines electrodialysis with ion exchange. Water passes between an anion-permeable membrane and a cation-permeable membrane within an EDI cell, whose chamber contains loosely packed ion exchange resin. Ions are attracted towards the oppositely charged electrode but are flushed away before they can reach it, removing them from the water.
Together, these two processes create Type 2 water, which has a resistivity of 1–15 MΩ-cm, making it suitable for general applications such as buffer and media production, general chemistry and spectrophotometry.
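The resistivity ranges given in this article can be turned into a rough classification helper. The Type 1 and Type 2 thresholds below come from the figures quoted here (18.2 MΩ-cm and 1–15 MΩ-cm); treating everything below 1 MΩ-cm as Type 3 is an assumption for illustration.

```python
def classify_water(resistivity_mohm_cm: float) -> str:
    """Rough grade classification by resistivity (MΩ-cm).
    Thresholds follow the figures in this article; the Type 3
    cut-off is an illustrative assumption."""
    if resistivity_mohm_cm >= 18.0:
        return "Type 1 (ultrapure)"
    if resistivity_mohm_cm >= 1.0:
        return "Type 2 (general laboratory grade)"
    return "Type 3 (primary grade)"

print(classify_water(18.2))  # Type 1 (ultrapure)
print(classify_water(5.0))   # Type 2 (general laboratory grade)
```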
4. Ultrapure Water (Type 1)
With a resistivity of 18.2 MΩ-cm at 25°C, ultrapure (Type 1) water is a requirement for analytical laboratories. Flow cytometry, pyrogen sensitive applications, and cell and tissue culture are all typical applications for Type 1 water.
Water of this resistivity can still contain organic contaminants, endotoxins and nucleases, which do not affect resistivity values, so other technologies are required to eliminate them.
Equipment that produces Type 1 water is often referred to as a “polisher”, and can be fed from a localised reverse osmosis system, or a centralised ring main.
Bacterial and organic levels are kept low using dual-wavelength ultraviolet (UV) light (185 nm and 254 nm). The water flows through a vessel containing a UV lamp, and as it passes through, the light damages the genetic molecules that microorganisms need to reproduce. This damage prevents them from multiplying or replicating, so they cannot proliferate in the system.
An ultrafilter (UF) can also be used to produce DNase/RNase-free water. Ultrafiltration removes particles and macromolecules by size exclusion, and the membrane is sometimes charged to attract contaminants. This technology is employed at the end of the system to ensure the near-total removal of macromolecular impurities.
Each type of water must pass through particular processes and technologies to achieve a given standard of purity, and that level of purity determines what it can be used for in the laboratory. For that reason, it's important to be able to distinguish between the four types of water and to understand how each is utilised.