Go further: Improve industrial wastewater assets beyond the expensive items with advanced analytics applications
In process manufacturing environments, plant operations such as wastewater treatment and neutralization are often taken for granted, but these systems must always be ready to support upstream operations. Because these types of ancillary utilities are essential for cost-effective manufacturing, a conservative attitude is often taken when evaluating new technologies.
Changes and upgrades typically affect methodologies, chemistry, and setpoints, and anticipating the required adjustments is difficult with outdated analytical tools like spreadsheets.
When subject matter experts (SMEs) attempt to quantify the impacts of these changes or investigate other issues, there is often limited time for analysis, as well as obstacles to gathering all the relevant data. Advanced analytics applications help bridge the gaps between multiple sources, connecting data streams like wastewater treatment with other manufacturing information.
Using data to support decisions in manufacturing environments is becoming more standard, and as industrial wastewater operations seek to modernize, it is imperative to embrace this trend.
The Challenges of Distributed Data Analytics
When analyzing data in industrial environments, it is normal to focus on the most expensive items as sources of possible improvement. In wastewater treatment, this includes assets such as neutralization tanks, settling ponds, digesters, and sanitizers. These assets are generally considered valuable enough to justify the time and effort required to combine the relevant data into a spreadsheet, check its validity, align and clean it, and finally perform the analysis of interest to make a decision based on the data.
Meanwhile, many other assets and processes don’t receive the same scrutiny and are frequently left in “set-it-and-forget-it” mode, as long as they don’t introduce known issues.
Between the sometimes conflicting challenges of meeting industrial process regulatory requirements and adapting to upstream changes in chemistry or process conditions, the analyses necessary for continuous improvement in smaller operations can fall by the wayside. At best, occasional improvements are made, but they are usually based only on tribal knowledge or hunches not backed by analysis.
As industrial wastewater treatment teams seek to modernize their operations, the ability to develop and deploy analytics at scale across a facility, not just on the largest or most valuable assets, is becoming the new standard. Additionally, collaboration across teams is more important than ever, and enhanced information sharing and reporting tools make it easier to ensure optimal operations and transparency across multiple operating and engineering groups. Greater collaboration capability can mean the difference between meeting or missing aggressive enterprise-wide initiatives, such as reducing energy consumption or improving sustainability.
Break down barriers to go further with advanced analytics
Cloud-hosted advanced analytics applications address these and other common data challenges. These software tools first establish a live connection to all existing data sources in a facility, not limited to manufacturing operations. This effectively breaks down the silos containing the data needed to perform analytics across the entire industrial process, including upstream and inbound process changes.
Using these applications, SMEs can quickly create calculations with simple point-and-click tools to clean, contextualize, and model processes, performing descriptive, diagnostic, and predictive analytics on operational and equipment data. With these tools, the engineering hours required for analysis can be drastically reduced, eliminating large amounts of time wasted on manual data manipulation and sorting in spreadsheets. This frees qualified personnel for other tasks, such as process optimization and improvement. And it can expand the scope of analyses beyond a site’s largest and most important assets, bringing the rewards of multiple distributed plant-wide improvements.
Additionally, with the ability to ingest or create an asset hierarchy in the data that reflects plant equipment, analyses performed on a single asset can easily be replicated at scale across the entire fleet, enabling easy monitoring of many assets that might have been overlooked in the past due to the perceived return on time invested. These results can then be captured and communicated across the organization. This mitigates the siloed, time-consuming, and error-prone analyses that occur when organizations rely on spreadsheet applications for data analysis.
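The pattern described above can be sketched in a few lines of plain Python: define one calculation, then apply it across every asset in a hierarchy. The asset names, pH readings, and excursion limits below are invented for illustration and do not come from any specific plant or from the Seeq API.

```python
# Hypothetical sketch: one analysis templated across an asset hierarchy.
# Asset names, tag values, and limits are illustrative only.
assets = {
    "neutralization_tank_1": [7.2, 6.8, 9.5, 7.0],
    "neutralization_tank_2": [7.1, 7.0, 7.3, 6.9],
    "settling_pond_1": [6.5, 10.1, 9.8, 7.2],
}

def flag_excursions(ph_readings, low=6.0, high=9.0):
    """Count pH readings outside the acceptable band."""
    return sum(1 for ph in ph_readings if ph < low or ph > high)

# The same calculation, replicated over every asset in the hierarchy.
report = {name: flag_excursions(readings) for name, readings in assets.items()}
for name, excursions in sorted(report.items()):
    print(f"{name}: {excursions} excursion(s)")
```

The point of the sketch is the structure, not the chemistry: the analysis is written once and the asset hierarchy supplies the fan-out, which is what makes fleet-wide monitoring cheap compared with one spreadsheet per asset.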
Optimize a chemical dosing application
The process of neutralizing, flocculating, softening, or restoring process waste to the quality required for disposal or reuse in the facility typically requires large amounts of treatment chemicals. The exact volume of chemicals required for treatment – whether for pH neutralization, bioburden disinfection, or another process – has historically been difficult to determine on an ongoing basis. As a result, many industrial facilities end up over-treating their wastewater, using more chemicals and energy than necessary.
By implementing Seeq, an advanced analytics application, a dairy-based food and beverage manufacturer was empowered to compose accurate chemical dosage calculations, with time freed up by the reduction in manual data manipulation. Leveraging a live connection to process data, the plant’s process engineer analyzed historical data, comparing the bioload in the waste-activated sludge with the amount of chemicals used for treatment.
These chemical quantities had been based on a single worst-case scenario rather than actual contamination conditions. By creating a conversion table of the chemicals needed to treat each given bioburden level, the engineer analytically quantified the amount of chemicals that could have been saved using a dynamic control strategy rather than worst-case dosing. Notably, the old setpoint did not always treat the wastewater adequately (Figure 1).
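The comparison the engineer performed can be illustrated with a minimal sketch: dose every batch at the worst-case setpoint, then dose each batch from a bioburden-to-chemical conversion table, and compare totals. The table, bioburden readings, and units below are invented for the example, not taken from the case study.

```python
# Hypothetical dosing comparison: fixed worst-case setpoint vs. a dose
# looked up from measured bioburden. All numbers are illustrative.
conversion_table = {  # bioburden band (mg/L) -> chemical dose (L per batch)
    (0, 100): 10.0,
    (100, 300): 25.0,
    (300, 600): 45.0,
}
WORST_CASE_DOSE = 45.0  # fixed setpoint sized for the highest band

def dynamic_dose(bioburden_mg_l):
    """Return the dose for the band containing this bioburden reading."""
    for (low, high), dose in conversion_table.items():
        if low <= bioburden_mg_l < high:
            return dose
    return WORST_CASE_DOSE  # loads above the table fall back to worst case

measured_bioburden = [80, 150, 90, 420, 110, 60]  # one reading per batch
fixed_total = WORST_CASE_DOSE * len(measured_bioburden)
dynamic_total = sum(dynamic_dose(b) for b in measured_bioburden)
print(f"fixed: {fixed_total} L, dynamic: {dynamic_total} L, "
      f"saved: {fixed_total - dynamic_total} L")
```

In this toy data set the dynamic strategy uses less than half the chemicals of the fixed setpoint, which mirrors the overtreatment pattern the engineer uncovered, though the actual magnitudes at the plant would depend on its real conversion table and bioburden history.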
During this investigation, the engineer found that the wastewater was, on average, dramatically overtreated. By adjusting the control strategy to vary the dose based on measured process levels, the facility reduced chemical costs, saving hundreds of thousands of dollars each year.
Explore and quantify automatic control improvements
In addition to developing control strategies, improving process logic and the control loops themselves can unlock hidden goldmines of savings. Control systems in wastewater treatment processes are typically designed and implemented during facility construction, and they are rarely revisited during upgrades due to the data-intensive nature of the analyses required to characterize each control loop.
In the wastewater treatment plant of a large pharmaceutical manufacturer, engineers sought to study the effectiveness of multiple control loops throughout the facility. Stuck with existing disparate databases and spreadsheet methods, this effort would have been time-consuming, even with a limited scope. Using Seeq, the engineering team quickly connected, cleaned, and contextualized data from their various control loops. Next, they quantified metrics for these loops, such as time spent within a percentage of setpoint, deviation from setpoint, time to reach setpoint, and more (Figure 2).
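The loop metrics named above are straightforward to compute once the process data is cleaned and aligned. The sketch below shows one plausible definition of each metric against a short, invented sample of a process value; the setpoint, band width, and sample interval are assumptions for illustration, not the plant's actual figures.

```python
# Hypothetical control-loop metrics over a sampled process value (PV).
# Setpoint, band, and data are illustrative; 1 sample per minute assumed.
setpoint = 50.0
band_pct = 0.05  # "within 5% of setpoint"
samples = [40.0, 44.0, 48.5, 49.6, 50.3, 50.1, 49.9, 50.2]

# Time spent within a percentage of setpoint
within = [abs(pv - setpoint) <= band_pct * setpoint for pv in samples]
pct_time_in_band = 100.0 * sum(within) / len(samples)

# Maximum deviation from setpoint over the window
max_deviation = max(abs(pv - setpoint) for pv in samples)

# Minutes until the PV first enters the band (None if it never does)
time_to_setpoint = next((i for i, ok in enumerate(within) if ok), None)

print(f"{pct_time_in_band:.0f}% in band, max deviation {max_deviation}, "
      f"reached band at t={time_to_setpoint} min")
```

Computed per loop and rolled up through an asset structure, metrics like these make it easy to rank loops from best- to worst-performing, which is the ranking the engineering team used to target tuning and equipment changes.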
Leveraging Seeq’s ability to create ad hoc asset structures, the group combined lab and manufacturing data into a single asset structure focused only on the control loops of interest. Then they extended the analysis from one loop to several others within the facility. As a result, the engineering group identified opportunities for improvement by flagging the least efficient loops, along with ways to tune each loop or replace equipment accordingly. This resulted in an average 50% reduction in the time required to reach setpoints across multiple plant-wide processes.
Staying competitive in today’s fast-paced manufacturing environment requires operational agility and plant optimization at every stage of a process. With yesterday’s tools, it is difficult for process manufacturers to properly analyze, operate, and manage industrial assets and processes, but today’s modern software tools make these tasks much more manageable.
Advanced analytics applications connect data between previously disparate sources, quickly generate insights, and promote information sharing among multiple teams within an industrial organization, helping optimize processes and smooth collaboration among facility personnel. By leveraging these tools, SMEs can spend less time worrying about managing and contextualizing data, and more time reviewing insights with team members to improve performance, sustainability, and plant-wide profitability.
Sean Tropsa began his career as an engineer in the specialty manufacturing and semiconductor manufacturing industries. There he learned how to analyze large datasets, with a focus on root cause analysis and continuous process improvement to solve a variety of problems. Leveraging this experience in his role as Senior Analytics Engineer at Seeq, Sean helps companies improve their analytics, enabling engineers to gain actionable insights from their data. He holds bachelor’s and master’s degrees in chemical engineering from Arizona State University.