Saturday, October 8, 2016

Reflections on WEFTEC 2016

I've just returned from the latest WEFTEC, held in New Orleans. It was a great opportunity for me to catch up with colleagues and friends I've gotten to know over the years. I went to my first WEFTEC in 1999 and, except for one year when I was stuck on assignment in Singapore, I'm pretty sure I've been to every one since. So I could blog about catching up with old friends, but instead I thought it would be interesting to reflect on what was NEW for me this year. So here goes...

New Perspectives

Joe Whitworth
The opening session included an interesting presentation from Joe Whitworth of The Freshwater Trust. It was refreshing to see someone who identifies as an environmentalist wanting to work collaboratively, rather than taking the typical obstructionist approach of well-meaning but naive and short-sighted groups. The majority of the people I work with want to protect and improve the environment, so you'd think that the likes of the Sierra Club and other environmental NGOs would be our advocates in improving treatment, but my own observation is that they're sue-happy and prevent real progress by pushing everything into the interminably slow US legal system. When I was a kid doing environmental studies, the US EPA was held up as a great example of how to improve the environment through science-based investigation and laws. Unfortunately, the good things that started in the 1980s now seem to be stagnating in the courts, where the "process" is more important than actually doing what's best for the environment.

My good friend and visionary in the field of water, Dr Sudhir Murthy, gave an excellent presentation on research and innovation for water utilities, though I have to say that Bernhard Wett stole the show in his introduction to Sudhir by describing him as "non-linear"! Not sure why that tickled me so much, but maybe it's because he's such an intriguing and interesting fella that it's impossible to label him. Nice one Bernie!

Poop-talk radio!

New Experience

Thanks to WaterOnline.com I had the opportunity to do a radio interview on the topic of Big Data. I wasn't quite sure what to expect going into the interview, but it was actually good fun. Big Data, the Internet of Things, Smart Utilities and the like are hot topics that are gaining a lot of interest with a whole bunch of people. See my previous blog for more info...

New Responsibilities

This year I took over the reins of the Municipal Resource Recovery Design Committee (MRRDC).  This is one of the largest committees of WEF and is tasked with upholding the quality of design for treatment facilities. Amongst many things, we are heavily involved in the iconic "Manual of Practice #8", or MOP8, which is used extensively in North America for treatment plant design. We also help develop other technical documents and workshops for WEFTEC.  My predecessor, Dr Art Umble, did an excellent job as chair, ending up being recognized as a Fellow of WEF.  I jokingly said that my main job was to not screw up the good things he started! Hopefully I'll do a little better than that.

New Technologies

No self-respecting poop engineer could blog without mentioning the cool technologies in the exhibit hall. Aerobic Granular Sludge (AGS) is causing quite a buzz in the poop-treatment industry and my firm is in the mix, teaming with Royal HaskoningDHV, the developers of the Nereda process. Another technology causing a stir is GE's new "ZeeLung" membrane aerated biofilm reactor (MABR) process which, on paper, can give up to 100% oxygen transfer efficiency (OTE) and in practice is giving at least 50%.  This is a game-changing technology in my book, as it at least doubles the energy efficiency of aeration. Watch this space for more (or check out the video below)!
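To put a rough number on that efficiency claim, here's a back-of-envelope sketch. The ~30% fine-bubble OTE baseline is my own illustrative assumption for a deep tank, not a figure from GE or the conference, and the calculation deliberately ignores the extra savings from the MABR's low-pressure air delivery.

```python
# Back-of-envelope: aeration energy per kg of O2 transferred, treated as
# roughly inversely proportional to OTE (ignores delivery pressure differences).
# ASSUMPTION: ~30% OTE for conventional fine-bubble diffusers in a deep tank.
fine_bubble_ote = 0.30
mabr_ote_low, mabr_ote_high = 0.50, 1.00   # "at least 50%" and "up to 100%" quoted above

for ote in (mabr_ote_low, mabr_ote_high):
    rel_energy = fine_bubble_ote / ote
    print(f"At {ote:.0%} OTE: ~{rel_energy:.0%} of fine-bubble energy per kg O2")
# The OTE ratio alone gives roughly a 1.7x to 3x improvement; add the low-pressure
# air supply and the "at least doubles the energy efficiency" claim looks plausible.
```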

New Culinary Delights!

Oh boy, if you're ever in New Orleans, you must try the Trinity restaurant. In a city known for good food, this has to be one that's near the top of the pile.  (Which reminds me, I must go to TripAdvisor and leave a review - 6 stars out of 5, I'm thinking). We had a dinner with a client group on Monday evening at which I had a crab-filled beignet for starter, a fabulous pork shank for main and possibly the best dessert I've ever had to finish (some kind of macadamia nut dessert with a subtle caramel sauce and mango - I can't even begin to describe it adequately!). On Wednesday night I was invited out by the good folks at S::CAN for a dinner and, to my delight, they picked the same restaurant!  This time I was able to try their lamb chops, which I swear had chocolate in the sauce (some kind of mole perhaps?) - it was equally awesome!

So another WEFTEC is done.  I think it was one of the best for me personally and hopefully for the 20,000 others who attended.  Let's see if WEFTEC 2017 in Chicago can top it!  

Friday, September 9, 2016

Get M.A.D. to be smart!

In a couple of weeks I'm heading off to WEFTEC, the major annual North American conference for poop engineers like myself. This year I'll be participating in a breakfast meeting where we'll discuss my firm's smart analytics offerings, which sit under the umbrella of our Smart Integrated Infrastructure group.

For the breakfast meeting I persuaded my colleagues to take their lead from a recent book by Pernille Ingildsen and Gustaf Olsson called "Smart Water Utilities: Complexity Made Simple". Their excellent book uses a very simple but useful mnemonic to help us get our minds around the most important aspects of instrumentation and control in the water industry.  Quite simply, it is M.A.D.!

M is for Measure

Firstly, and perhaps most importantly, we need to focus on the measurements.  What do we need to measure (and where)?  How do we measure it? And what confidence do we have in that measurement?  It might seem simple to say we need to be measuring the right parameters in the right place, but in my experience this is where we've fallen down over the years, particularly when it comes to measurements in the extremely fouling environment of wastewater.  All too often people have taken standard environmental monitors for water, streams or rivers or - worse still - lab instruments, and plonked them into poop water hoping they'll work.  Unfortunately wastewater is not forgiving.  In the late 1980s and early 1990s my old boss Dr John Watts did some great work looking at the need for good calibration and validation in order to trust your data.  Unfortunately he didn't publish much internationally, but here is one paper on the topic. More recently, Oliver Grievson, in his review of activated sludge instrumentation to celebrate 100 years of AS, gave a nod to John's work in producing an online respirometer (and I'm still waiting for someone to produce something nearly as good - maybe the ASP-CON?).
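To make the "confidence" question a bit more concrete, here's a minimal sketch of one way to quantify it: comparing an online sensor against occasional lab check samples to estimate bias and scatter. The numbers and column names are entirely hypothetical.

```python
import pandas as pd

# Hypothetical paired readings: online ammonia sensor vs. lab check sample (mg/L NH4-N)
checks = pd.DataFrame({
    "sensor": [4.8, 5.1, 12.3, 3.9, 7.7, 6.2],
    "lab":    [5.0, 4.9, 11.5, 4.2, 7.5, 6.1],
})

error = checks["sensor"] - checks["lab"]
bias = error.mean()                    # systematic offset (drift or calibration issue)
rmse = (error.pow(2).mean()) ** 0.5    # overall scatter

print(f"Bias: {bias:+.2f} mg/L, RMSE: {rmse:.2f} mg/L")
# A bias that grows between calibrations is usually the first sign the sensor
# needs cleaning or recalibrating - no amount of downstream analytics fixes that.
```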

A is for Analyze

The focus of a lot of the buzz right now is on "smart analytics" and the ability of software developed in the Internet age to handle "big data." That's all pretty cool.  I always joke about wastewater having a problem with "crap data" rather than big data, but assuming we can figure out the "M" of measure, then there are now plenty of sophisticated tools to help us manage and analyze our data. All the big guys in IT are getting into this space, including IBM with the now-famous Watson and Microsoft with their Power BI (I need to find time to play with that sometime as it looks pretty cool).
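For flavor, here's the kind of lightweight first pass I have in mind before reaching for the heavyweight platforms: a minimal pandas sketch that rolls noisy sensor data up into trends. The file name and column names are made up for illustration.

```python
import pandas as pd

# Hypothetical 1-minute SCADA export with a timestamp column and a DO sensor column
df = pd.read_csv("aeration_basin.csv", parse_dates=["timestamp"], index_col="timestamp")

hourly = df["do_mg_l"].resample("1H").mean()                    # smooth out the noise
daily_trend = hourly.rolling(window=24, min_periods=12).mean()  # 24-hour moving average

summary = pd.DataFrame({"hourly_mean": hourly, "daily_trend": daily_trend})
print(summary.tail())
```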



D is for Decide

OK, we have lovely measurements producing pretty graphical representations of our big data... now what?  This will be the fun part. Right now, most systems I've seen leave the decide step to the operator or plant engineer.  They have the expert knowledge which, coupled with insights from the advanced data analytics, makes a powerful combination to help optimize a plant.  The step beyond this is to add in automated control actions based on the input from the smart analytics.  This is sort of like the jump to an autonomous vehicle, which makes many people nervous but ultimately will give us the best performance overall.
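To make the decide step concrete, here's a toy, rule-based sketch of the kind of suggestion an analytics layer might hand to an operator (or, one step further, wire straight into the controls). The thresholds and setpoint steps are invented for illustration and aren't from any real system.

```python
# Toy decision layer: turn an analytics output into an operator suggestion.
# All thresholds and setpoint steps below are illustrative assumptions.

def suggest_do_setpoint(ammonia_trend_mg_l_per_h: float, current_setpoint: float) -> str:
    """Suggest an aeration DO setpoint change based on the effluent ammonia trend."""
    if ammonia_trend_mg_l_per_h > 0.5:                    # ammonia climbing quickly
        return f"Raise DO setpoint from {current_setpoint:.1f} to {current_setpoint + 0.5:.1f} mg/L"
    if ammonia_trend_mg_l_per_h < -0.5 and current_setpoint > 1.0:
        return f"Lower DO setpoint to {current_setpoint - 0.5:.1f} mg/L to save aeration energy"
    return "Hold the current setpoint"

print(suggest_do_setpoint(0.8, 1.5))   # -> suggests raising the setpoint

# In "advisory" mode a human acts on the suggestion; the autonomous-vehicle
# jump is simply feeding that output into the controller automatically.
```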


Further Reading

I heartily recommend Pernille and Gustaf's book for anyone considering smart analytics.  As an introduction to it, you might also read Pernille's blog, "Why are "Smart Water Utilities" not already here?"

So, if you want to be smart... get MAD!


Wednesday, March 9, 2016

Model-ing Citizens

At the end of this month I'll be heading off to Europe for the latest wastewater process modeling conference - WWTmod2016 - which I'm sure will be as interesting as previous seminars in the series.  It's a pretty cool gathering of process engineers and/or modelers digging into the process models that are a big part of what I do as a poop engineer.

In thinking about the seminar, I started to reflect on the fact that there are a lot of good wastewater process simulators on the market right now, four of which I'll talk about in this blog.  Before I get to them though, I have to comment on the fact that there are so many simulators in what is a pretty niche market.  If I think back to spreadsheets in the 1980s, there were several on the market - Lotus 1-2-3, Excel and Quattro Pro (the last being my fave for some time) - but today the market is dominated by Excel alone (OK, there are a few diehards using OpenOffice, Numbers on the Mac or Google Sheets for simple collaborative tasks).  So how come there are so many good wastewater process simulators on the market right now? I'd like to suggest a few possible reasons:
  1. Wastewater treatment is complex (and fascinating!) and so we need models to help us figure it out.
  2. Wastewater modeling has strong "champions" who have really driven the ideas and industry in a positive direction.  In another blog I might list out several of these individuals, but from my own personal experience and because he's turning 60 soon I'll just mention Imre Takacs here.  Super nice fella with a real passion for modeling, including his latest venture with SUMO.
  3. Wastewater modeling is cool.  Let's face it, producing diurnal graphs and "playing" with a virtual plant is way cool.  That's why I like it, right?

Simulators

OK, so here is a list of my personal favorite simulators.  They each have their particular strengths but I resist the temptation to say which is "best" despite being asked many times.  They're all good and useful tools.  Some do some things better than others and it's always a moving target as they each add new features. There are other simulators on the market too, but my exposure to them is limited, so the fact that I've not listed them is in no way a slight on their capabilities.  So here are, in order of my exposure to them and not in order of preference, my fave four!

GPS-X

For me, it all started with GPS-X 2.0 running on Unix on an HP computer.  That's real modeling!  They and everyone else shifted over to Windows, which was more convenient, for sure, but as a result our models ran considerably slower until recent years.  I still wonder if GPS-X running on a modern machine with Linux wouldn't be the way to go!

The big plus for GPS-X is the user interface. Maybe it's because I started with this simulator, but I still love the ability to use sliders for control, set up graphs and have scenarios all in one interface.  It has its quirks you need to learn, like all the simulators, but in terms of being easy to run and adjust models, it's great.


BioWin

So, I find GPS-X one of the easiest simulators to run, but BioWin has deservedly won the reputation amongst design engineers for being the easiest to set up.  There are many engineer-friendly features in the model that make it the go-to for many process engineers.  Another strength is the biokinetic model (ASDM), which has been termed a "super model" because it carries all variables and all rate equations through all process units, making it easier to ensure the mass balance holds.  They also pride themselves on having decent defaults for most parameters under most conditions. You should never use any model "out of the box" without knowing what you're doing, but with BioWin you maybe don't have to move too far out of the box!


SUMO 

Here's the new kid on the block.  Having done his time with Hydromantis (producers of GPS-X) and EnviroSim (the BioWin folks), Imre struck out on his own to develop a whole new modeling platform from the bottom up.  It's pretty exciting to see the development.  From what I've seen so far, there are two main tenets guiding its development: (1) a modern user interface (pretty cool); and (2) ease of access to the biokinetic models.  The latter is the most exciting piece for me as it's allowing us to do a lot of investigations for WERF projects and other applications.  For those at WWTmod2016, you'll get to see some of this when my colleague Patrick Dunlap presents some initial modeling for one of those projects.


SIMBA#

Last, but only in my own chronology as it's been around for years in Europe, is SIMBA#.  This simulator is very strong for anyone wanting to look at control.  It also has some nice energy features which they're continuing to develop and refine.  I'm thinking this "old kid on the block" may start to get some traction in North America in the next year or so.



So... what's your favorite wastewater process simulator (assuming you have one), and why?

(Copyright disclaimer... all graphics on this blog were taken from the software supplier's websites.  Please check out their sites for the original graphics and further info)

Monday, January 4, 2016

Bad data versus big data (or big bad data!!)

There's currently quite a buzz about "big data" and how water utilities might dig into all the data they collect in order to be "smarter." Several of my colleagues are investigating ways to do this under the banner of Smart Integrated Infrastructure (SII) and Smart Water Analytics. Pretty cool stuff. In a couple of conversations on the topic I half-jokingly said that wastewater doesn't have big data, it has crap data!  To avoid misunderstanding, I should clarify that by "crap" I'm referring to it being bad data and not just data describing the fecal material we treat!

Over the years I've been involved with various projects and discussions on generating and handling data in wastewater treatment. A few years ago I was involved in a couple of WERF projects focused on developing Decision Support Systems (DSS) to prevent plant upsets, along with Dr Nancy Love and Advanced Data Mining (ADMi). The folks at ADMi did some nice data analytics to pick out anomalies that might indicate toxins in the plant influent, but one of the major hurdles we ran into was distinguishing anomalies due to toxins from anomalies due to measurement problems. This reminded me of what my ex-boss and mentor, Dr John Watts, used to drill into me: you need to focus on good primary measurements in order to have confidence in your data. Wastewater is a tough place to try to do that! As I said, a lot of our data is bad.

So, here is my brain dump on some of the keys to making big data work in wastewater, and avoiding the pitfalls of bad big data (there's a tongue-twister there somewhere...)!

5 keys to making big data work


1. Focus on data quality rather than quantity

Starting from Dr Watts' sage advice to me years ago, written up in one of his rare papers here, no amount of fancy analytics can overcome measurement errors, whether that's noise, drift or interferences.  You need to have confidence in your primary sensors and analyzers, otherwise your big data analytics will be crunching numbers that are meaningless and any results you get will be useless.  Crap data = crap analytics!

In order to gain confidence in your data, you need to do 3 things with your sensors/analyzers:
  1. Clean them - wastewater is an extremely fouling environment and not the best place to put scientific equipment.  My experience has been that everyone underestimates how quickly sensors become fouled.  Go for auto-cleaning whenever possible and avoid installing anything in raw sewage or primary effluent unless you really need the measurement (see Key #2!), as these areas are particularly prone to fouling. Mixed liquor is actually an easier place to take measurements, and final effluent the easiest of all!
  2. Calibrate them - this is generally understood, though calibration, particularly for sensors that tend to drift, is generally done less frequently than it should be.
  3. Validate them - this is the piece that's overlooked by most instrumentation suppliers, I think. Analytics to validate the measurements, particularly during calibration, is an area that needs much more attention.
Much of the work that Dr Watts did at Minworth Systems was focused on automating these three things, and I've seen very few instruments come close to what he did 20 years ago!
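For a flavor of what automating the "validate" piece could look like, here's a minimal, hypothetical sketch of some plausibility checks on a sensor signal (range, sudden jumps and a stuck/flatlined sensor). This is not how Dr Watts did it at Minworth - all the limits below are made-up illustrations.

```python
import pandas as pd

def validate_signal(series: pd.Series, lo=0.0, hi=10.0,
                    max_step=1.0, flatline_window=30) -> pd.DataFrame:
    """Flag samples that fail simple plausibility checks (illustrative limits only)."""
    flags = pd.DataFrame(index=series.index)
    flags["out_of_range"] = (series < lo) | (series > hi)
    flags["big_jump"] = series.diff().abs() > max_step                 # sudden spikes
    flags["flatlined"] = series.rolling(flatline_window).std() < 1e-3  # stuck sensor
    flags["suspect"] = flags.any(axis=1)
    return flags

# Example: a DO signal with one spike and a stuck sensor at the end
do = pd.Series([2.1, 2.3, 2.2, 8.5, 2.4] + [2.0] * 40)
print(validate_signal(do)["suspect"].sum(), "suspect samples flagged")
```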

2. Measure what matters most

I could probably make this blog an ode to John Watts and fill it with his anecdotes.  One of my favorites was when a customer asked him to install a dissolved oxygen (DO) probe in an anoxic zone. He suggested it would be cheaper to install a wooden probe and write 0 mg/L on a fake display!  Maybe that's a little harsh, but the point is that we should only measure things that are useful in helping us run the plant and that we're actually going to use to make some decision. Generally we're lacking many important and basic measurements in our treatment plants (e.g. dissolved oxygen in the aerated basins, airflow to each aeration zone and electricity use by blowers), but we need to be careful in our enthusiasm not to swing to the other extreme and start measuring stuff that's interesting but not useful. You can spend some serious money measuring ammonia and nitrate all over a treatment plant, but unless you're actually using it for control, the measurements will eventually be ignored and the instruments neglected.  It's much better to have a handful of good instruments positioned in locations where you're actually measuring something you can control; then there's motivation to keep those sensors running well (see Key #1!)

3. Think dynamics, not steady state

A lot of the design and operational guidance in textbooks and training materials has simple equations into which you plug a single number to get your answer (e.g. a sludge age calculation or removal efficiency). Similarly, influent and effluent samples are usually flow-weighted or time-averaged composites (or, worse still, grab samples!).  All this means that we're used to thinking and talking about average daily conditions.

(Graphic showing the difference between a composite sample and a continuous measurement - courtesy Dr. Leiv Rieger/WEF, taken from the WEF Modeling 101 Webcast)
However, the reality is that our treatment plants see significant daily variations in flows and concentrations, and therefore we need to look at them as dynamic systems. This was first brought home to me when I was working on a plant in the UK doing biological phosphorus removal back in the late 1990s. We had an online phosphate analyzer taking measurements at the end of the aeration basin just prior to the clarifiers, and we would see phosphate peaks of 1 or 2 mg/L every afternoon for just an hour or so, yet the effluent composite sample measurements would be pretty consistently below 0.2 mg/L. To understand our wastewater treatment systems we need to measure their dynamics and then analyze that good data (having adhered to Keys #1 and #2, of course!!)
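Here's a tiny, hypothetical illustration of how a daily composite can hide exactly that kind of afternoon peak - the numbers are invented but mimic the phosphate example above.

```python
import numpy as np

# Hypothetical 24-hour phosphate profile (mg/L, hourly values): low most of the
# day, with a short afternoon peak like the one the online analyzer caught.
po4 = np.full(24, 0.10)
po4[15] = 1.5                       # one-hour peak at 3 pm

print(f"Daily (composite-style) average: {po4.mean():.2f} mg/L")  # ~0.16 - looks fine
print(f"Peak seen by the online analyzer: {po4.max():.2f} mg/L")  # 1.50 - a very different story
```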

4. Recognize different timescales

Hand-in-hand with dynamics is the need to think about different timescales:

  • Diurnal (daily) variations
  • Weekly trends (especially weekend versus weekday differences)
  • Seasonal shifts
For each of these, the data analytics needs are quite different and should be thought through properly. For diurnal variations, it's useful to compare one day to the next, perhaps by overlaying the dynamic data; for weekly trends we can do something similar over a 7-day horizon; for seasonal shifts we need to plot long-term trends and compare them to temperature and maybe rainfall.
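As a quick, hypothetical example of slicing the same record at those different timescales with pandas (the file and column names are made up):

```python
import pandas as pd

# Hypothetical continuous influent flow record with a timestamp index
flow = pd.read_csv("plant_influent.csv", parse_dates=["timestamp"],
                   index_col="timestamp")["flow_mld"]

diurnal = flow.groupby(flow.index.hour).mean()                  # average daily profile
weekday_vs_weekend = flow.groupby(flow.index.dayofweek >= 5).mean()
seasonal = flow.resample("M").mean()                            # monthly means for long-term trends

print(diurnal.round(1))
print(weekday_vs_weekend.rename({False: "weekday", True: "weekend"}).round(1))
print(seasonal.tail())
```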

5. Consider how to handle outliers and extraordinary events

This blog is getting long, so I'll try to wrap up this 5th key quickly!  In data analytics it's common practice to identify and eliminate outliers, assuming they're either "bad" measurements or not typical and can therefore be ignored.  However, thinking back to my involvement in the WERF projects on DSS, a lot of what is done at wastewater treatment plants is trying to keep the process stable in response to abnormal events such as upsets from shock loads or toxins, or, more typically, wet weather.  This means we need to identify these "outliers", but rather than throw them away, we need to decide how to respond. Maybe this is a topic for another blog?!!
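To close with one more hypothetical sketch: a simple rolling z-score that flags outliers for follow-up instead of silently deleting them. The thresholds, window and tag names are all invented.

```python
import pandas as pd

def flag_outliers(series: pd.Series, window: int = 96, z_limit: float = 4.0) -> pd.Series:
    """Flag points far outside recent rolling behavior - for review, not deletion."""
    rolling_mean = series.rolling(window, min_periods=window // 2).mean()
    rolling_std = series.rolling(window, min_periods=window // 2).std()
    z = (series - rolling_mean) / rolling_std
    return z.abs() > z_limit

# Hypothetical influent conductivity record at 15-minute intervals
cond = pd.read_csv("influent_conductivity.csv", parse_dates=["timestamp"],
                   index_col="timestamp")["conductivity_us_cm"]
events = cond[flag_outliers(cond)]
print(f"{len(events)} samples flagged for review (shock load, storm... or a fouled sensor?)")
```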