Quality Assurance and Sustainability in Lithium-Ion Battery Production
Renewable and green energies are increasingly being implemented worldwide, owing to government initiatives aimed at reducing greenhouse gas emissions and slowing the impact of climate change. The upcoming restrictions on internal combustion engines for automotive applications are also driving research and development into more effective ways of storing energy and limiting environmental pollution, and lithium-ion battery technologies are at the forefront of these efforts. However, there is a difficult balance for manufacturers to strike between scaling up production to meet growing demand and maintaining the high quality expected by consumers. The market requires ever more energy-dense, lightweight and fast-charging batteries that can be quickly and affordably produced in bulk. Even very small irregularities that appear early in the production process can significantly impact the functionality and safety of the final product.
Image 1: Some of the key applications for lithium-ion batteries.*
It is therefore critical that defects in lithium-ion battery components are reliably detected as soon as possible through continuous process monitoring, to ensure optimal performance and safety levels. Early defect identification also reduces raw material waste and minimizes the risk of costly production downtime during repairs or quality control failure investigations, which can eat away at a company’s profits. Non-uniformity can occur in the thickness of the outer coating, the cathode and anode layers, and the separator film, while chips and flaking material are further potential hazards. Defects in any battery component can have serious implications for safety, posing a fire risk in addition to jeopardizing the overall performance of the completed cell.
Finding the ideal measurement solution for your application
Fortunately, there are instruments available to provide analysis throughout the manufacturing process, allowing issues to be identified early on and rapidly traced back to their source. These systems can accurately and reliably determine aspects such as the chemical composition and structure of the anode, cathode and separator film, and the viscosity of the coating material. This ensures that defective or out-of-specification material is segregated before it is incorporated into the final product.
Image 2: In-line metrology systems used in the separator film manufacturing process to identify defective material.*
The manufacturing stage, the materials used, and the geometry of the final cell all determine the measurement equipment that should be used. For example, within the electrode coating process, the possibilities for in-line measurement are diverse. However, based on the absorption rate and basis weight of the target materials, beta ray technology tends to be the most suitable for this application, as it can differentiate between the substrate and the coating with high precision and resolution. The separator film, by contrast, is a low-density polymer, so most radiation passes through it with minimal attenuation, leaving even low-energy X-ray measurements heavily influenced by environmental factors. Infrared or beta technology is therefore preferable for measuring separator films, as it provides the required dynamic resolution. In-line metrology uses sensors based on these technologies to measure thickness and uniformity, ensuring the film will perform its critical role within the final cell.
Image 3: The end-to-end electrode coating process monitored and controlled at all stages.*
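To see why beta gauging can separate coating from substrate, it helps to recall that transmission gauges of this kind rely on an exponential (Beer-Lambert-style) attenuation law: the transmitted intensity falls off with the basis weight of the material in the beam. The sketch below illustrates the principle only; the absorption coefficient and count values are invented for the example, and a real gauge's calibration would come from the instrument vendor, not from this formula alone.

```python
import math

def basis_weight(i_transmitted: float, i_incident: float, mu: float) -> float:
    """Basis weight w (g/m^2) recovered from transmission, assuming
    exponential attenuation: I = I0 * exp(-mu * w)."""
    return math.log(i_incident / i_transmitted) / mu

def coating_weight(total_w: float, substrate_w: float) -> float:
    """Coating basis weight = total measured weight minus bare-substrate weight,
    as measured before and after the coating station."""
    return total_w - substrate_w

# Illustrative numbers only; mu (m^2/g) is material- and isotope-dependent.
MU = 0.0045
total = basis_weight(i_transmitted=3200.0, i_incident=10000.0, mu=MU)      # coated foil
substrate = basis_weight(i_transmitted=8100.0, i_incident=10000.0, mu=MU)  # bare foil
print(round(coating_weight(total, substrate), 1))  # → 206.4 g/m^2
```

Measuring the bare substrate upstream and subtracting it is one common way such gauges isolate the coating contribution.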
Optimizing your in-line measurement system to produce the best results
Once you have chosen an in-line measurement system, it is important to calibrate it correctly and make sure that it is set up for your application. For instance, the faster the scan speed, the more material you can measure, but potentially at the cost of reduced accuracy. This occurs when the sensor response cannot keep pace with the scanning speed and as a result produces blurred images of edges and defects. In this case, it is necessary to slow the scanner down to match the sensor in order to generate a clearer image.
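The trade-off above can be stated simply: the smear along the scan direction is roughly the distance travelled during one sensor response time, so the scan speed must not exceed the desired resolution divided by that response time. The helper below is a back-of-the-envelope sketch; the function name and example figures are illustrative, not taken from any particular instrument.

```python
def max_scan_speed(target_resolution_mm: float, sensor_response_s: float) -> float:
    """Maximum scan speed (mm/s) such that the distance travelled during one
    sensor response time does not exceed the desired spatial resolution."""
    return target_resolution_mm / sensor_response_s

# E.g. resolving 2 mm defects with a sensor response time of 10 ms:
print(max_scan_speed(2.0, 0.010))  # 200.0 mm/s
```

Running faster than this bound is what produces the blurred images of edges and defects described above; slowing the scanner back under it restores a sharp image.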
Another aspect to consider is the shape of the source; the beam should be narrow, since a wider beam blurs the coat-weight (loading) data at the edges of the image. A line source will help to achieve higher resolution and can visualize edge defects, streaks, wrinkles and scratches present on the surface. Additionally, tension variations and distortion in the scanner’s frame can cause misalignment between source and detector, resulting in measurement errors. It is therefore essential to minimize the impact of flutter and ensure that the scanner rails are parallel prior to use. The thermal expansion of the frame can also affect results, so it is particularly important to use a thermally stable frame with temperature compensation and algorithms that account for variations in the frame.
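The scale of the thermal expansion problem is easy to underestimate. Linear expansion follows delta_L = alpha * L * delta_T, and for a frame several metres long even a small temperature swing moves the source-detector geometry by tens of micrometres. The numbers below are illustrative assumptions (a generic steel expansion coefficient, a hypothetical 3 m frame), not specifications of any real scanner.

```python
def frame_expansion_um(alpha_per_K: float, length_m: float, delta_T_K: float) -> float:
    """Linear thermal expansion of a scanner frame, in micrometres:
    delta_L = alpha * L * delta_T."""
    return alpha_per_K * length_m * delta_T_K * 1e6

# Illustrative: a 3 m steel frame (alpha ~ 12e-6 /K) warming by just 2 K
# grows by ~72 um -- far larger than typical coating-thickness tolerances,
# which is why temperature compensation is built into the frame design.
print(round(frame_expansion_um(12e-6, 3.0, 2.0), 1))  # 72.0
```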
No matter how robust a system is, or how well it has been calibrated, it will require service and maintenance to stay in top condition and prevent technical issues. Many in-line measurement systems are now capable of remote monitoring and performing continuous self-diagnostics, sending notifications to the off-site technical team before the operator even realizes there’s a concern. This pre-emptive maintenance reduces production downtime and gives customers peace of mind.
The effects of global climate change can be lessened through enhanced renewable energy storage, and more efficient and widespread manufacture and recycling of lithium-ion batteries in particular could go a long way in supporting this goal. Real-time, in-line measurement systems help manufacturers to maintain the quality and safety of their lithium-ion batteries, while maximizing productivity and process efficiency, making these versatile products more widely available for a greater range of applications.
* All images courtesy of Thermo Fisher Scientific.
About the author
Chris Burnett studied Physics at Worcester Polytechnic Institute, MA/USA, and has held various positions at Thermo Fisher Scientific, including Director of Sensor Development and Manager of Systems Integration and Systems Production in the company's Flat Sheet Gauging business unit. With more than 25 years of experience, Chris now helps drive solutions for the battery industry in his role as Senior Field Marketing Manager.