MacSparky

Analog filter code

12 posts in this topic

Hi, I'm looking for some help with a bit of code for a PLC-5 analogue input. The application is a laser distance sensor measuring the amount of product in a bunker, as per the picture. The picture is very basic, but it gives the general idea of the application. The only problem I can foresee is stray product floating about the bunker occasionally breaking the laser's beam. Is there a filter that would average the input over a set number of scans? We used a similar system at my last place, but it was Siemens and I don't have access to the code. Any help would be appreciated. Cheers, Colin

The PLC-5 has an "AVE" instruction (see PDF below), or you can use a simple analog filter to filter the noise (see TXT below). Average.pdf Filtering.txt Edited by Mickey
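For readers without the attachments, here is a rough Python sketch (illustrative names, not Mickey's actual TXT) of averaging the input over a set number of scans, which is roughly what the AVE instruction does over a data file:

```python
from collections import deque

def make_scan_averager(n_scans=10):
    """Return an update function that averages the last n_scans readings."""
    window = deque(maxlen=n_scans)  # holds only the most recent n_scans readings
    def update(reading):
        window.append(reading)
        return sum(window) / len(window)
    return update

# Feed it one reading per scan; it returns the running average.
avg = make_scan_averager(4)
for r in [100, 102, 98, 100]:
    filtered = avg(r)
print(filtered)  # average of the last four readings: 100.0
```

As Peter points out below this approach weights every reading equally, so a fluff reading still drags the average down.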

You don't want to average each reading individually. One floating piece will ruin your average if it is given equal weight as a good reading. My first instinct is to take a few readings, say 4, find the maximum reading, and then use that. Hopefully the floating bits will not be able to break the laser beam 4 times in a row. If they can, you may need to use more readings before finding a max.

A more sophisticated method would ignore readings that differ too much from the distance to the mass being measured. I would take into account the speed of the conveyor. This is the trick we use when measuring distances with MDT rods: from the previous readings we predict a min and max for the next reading, and the next reading must fall within that min and max. We would take into account the normal rate of change in the readings due to motion. You may need to use the first trick to find the initial position and speed before using the second trick.

How fast is the mass moving? How often do you need an update? What kind of accuracy is required?
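Peter's second trick can be sketched as a gate that predicts a plausible band for the next reading and rejects anything outside it. This is only an illustrative Python model (all names and the constant-rate assumption are mine, not from the MDT code):

```python
def make_gate(max_rate, dt):
    """Reject readings that change faster than the process physically can.

    max_rate: largest credible rate of change of the level (units per second)
    dt:       sample interval in seconds
    """
    state = {"last": None}
    band = max_rate * dt  # predicted min/max is last reading +/- this band
    def update(reading):
        last = state["last"]
        if last is None or abs(reading - last) <= band:
            state["last"] = reading   # accept: within the predicted band
            return reading
        return last                   # reject: hold the last good value
    return update

gate = make_gate(max_rate=2.0, dt=0.5)  # slow process: <= 2 units/s, 0.5 s scans
print(gate(100.0))   # first reading is accepted: 100.0
print(gate(10.0))    # a fluff reading far outside the band is rejected: 100.0
print(gate(100.8))   # normal drift passes: 100.8
```

As Peter notes, you need a trusted initial reading (and an idea of the rate of change) before this kind of gating can start.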

Hi Peter, it is a slow-moving process, so an update time of maybe 10-20 seconds might work. As for accuracy, it doesn't need to be pinpoint. I'd settle for consistency! Thanks for the ideas so far, but I'm afraid writing code for it would be beyond my capabilities. Cheers, Colin

If you can sort 10 readings and use the 8th or 9th highest, that would probably be more than good enough. The floating pieces shouldn't cause 8 or 9 low readings.
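Reading that as "sort ten readings ascending and take the 8th (or 9th) element", so that the fluff's short-distance readings land at the bottom of the sort and get skipped, a quick sketch might look like this (names are illustrative):

```python
def pick_good_reading(readings, rank=8):
    """Sort the readings ascending and return the rank-th smallest.

    With rank=8 of 10, the low fluff readings (a beam break reads *short*)
    sit at the bottom of the sort and never get picked, and the two very
    highest values are also skipped.
    """
    return sorted(readings)[rank - 1]

readings = [100, 99, 12, 101, 100, 8, 99, 100, 101, 100]  # two beam breaks
print(pick_good_reading(readings))  # 100: both fluff readings ignored
```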

If the OP goes that route, the PLC-5 also has a nice sort instruction (see PDF below). Sort File.pdf

Based on the speed of the process as described, you may want to use a simple single-pole IIR filter. These are far simpler to implement than all the previous suggestions and easily tune out noise. Think of it as a simple low-pass filter, because that's exactly what it is. Big "spikes" get ignored. Code:

Filtered_reading = (Filtered_reading)*alpha + (New_reading)*(1-alpha)

Alpha is a tunable parameter. If you know the sample rate, you can relate it directly to the filter pole frequency, but usually you don't need that much sophistication. Simply plug in an appropriate value. For instance, alpha=0.9 means the output relies 90% on "history" plus 10% on the "new value". Flipping it to 10% gets you the opposite (and is pretty useless). Higher values (closer to 1) give you more and more filtering (a lower-frequency pole). You can also feed the output of one filter into the input of another, which results in a two-pole filter and an even steeper dropoff.

This is just about the most basic IIR filter, and Google searches will give you a lot more information on these. It's also called an exponentially weighted moving average filter. All the other filters that have been described are FIR filters (linear or nonlinear). FIR means "finite impulse response", meaning the filter relies on a finite number of past values. IIR means "infinite impulse response", meaning it relies (theoretically) on an infinite number of samples. IIR filters are the discrete implementation of analog filters, while FIR filters have no analog-world equivalent (except switched-capacitor filters, which are interesting but far more complicated than you'd ever see in the real world, except for SAW filters).
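The one-line formula above is easy to try out. A minimal Python sketch of the single-pole IIR (exponentially weighted moving average) filter, with illustrative names:

```python
def make_iir(alpha=0.9):
    """Single-pole IIR filter: y = y*alpha + x*(1-alpha).

    alpha is the weight given to "history"; closer to 1 means more filtering.
    """
    state = {"y": None}
    def update(x):
        y = state["y"]
        y = x if y is None else y * alpha + x * (1 - alpha)  # the one-liner
        state["y"] = y
        return y
    return update

f = make_iir(alpha=0.9)
for x in [100.0] * 20:
    f(x)                 # let the filter settle at the true level
spike = f(10.0)          # one fluff reading of 10
print(spike)             # 91.0: the spike only moves the output 9%
```

Cascading two of these (feeding one filter's output into another) gives the two-pole behavior mentioned above.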
Thanks for all the advice; I'm sure I'll have fun trying out your suggestions! Here's hoping I don't crash the thing!

NO, they won't! It depends on where the floating stuff breaks the laser beam. What do you do when the conveyor's actual position is at 100 and you average in floating stuff at 10 inches? Sorting to remove the floating-fluff readings is the way to go. The floating-fluff readings must not be averaged in.

Here is one way Peter's suggestion could be implemented. It is fairly simple, just six rungs long. A timer triggers storing a data sample every half second. After ten samples are stored (5 seconds), the data is sorted, then averaged using only the five lowest values and ignoring the five highest. If you actually want to average the five highest values instead of the five lowest, change the AVE instruction to start at #N11:6 instead of #N11:1. BUNKER.zip Edited by Alaric
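The real implementation is PLC-5 ladder logic in the attached BUNKER.zip; as a rough Python model of the same sort-then-average-half idea (names are mine, not the ladder's):

```python
def trimmed_average(samples, lowest=True):
    """Sort ten samples and average one half of the sorted file.

    lowest=True averages the five lowest (the #N11:1 case in the ladder);
    lowest=False averages the five highest (the #N11:6 case).
    """
    s = sorted(samples)
    half = s[:5] if lowest else s[5:]
    return sum(half) / 5

samples = [100, 99, 12, 101, 100, 8, 99, 100, 101, 100]  # two fluff readings
print(trimmed_average(samples, lowest=False))  # 100.4: fluff spikes discarded
```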

A faster solution is to use the median, which is known to be a robust statistic. Simply delete the "AVE" instruction and just take the value at N11:6, or since you have exactly ten values, use (N11:5+N11:6)*0.5 (the average of the middle two). I commonly use the median when I have multiple thermocouples, because just under half the thermocouples have to fail before it stops producing correct values.

Yes, averaging (such as with an IIR filter) does indeed "average in" bad values. But depending on the tuning parameter, the effect may be very small (almost nonexistent). It depends on how much "impulse noise" you are actually getting. If alpha is set so that a new sample carries only 1% of the weight and we are sampling at, say, 200 ms, then although the "fluff" does have an effect, a single bad sample over a 20-second period contributes less than 1% (of a full-scale error). If there are multiple bad samples and/or the error is fairly large, it will indeed cause quite a problem. Ultimately, using a robust statistic like the median is a simple way to avoid the problem altogether, but it does require additional data (memory) and computing time.

You can also construct a standard deviation (again with an IIR filter) from the differences between successive readings. Comparing the absolute value of the current difference against that statistic gives you an indication of whether or not to reject the value. Although there is a robust version of this too (use the median of the variances rather than the average), working on the differences tends to be more stable than working on the values themselves, and it provides a self-adjusting rejection band.
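The median-of-ten substitution is tiny. A hedged sketch (illustrative names; N11:5 and N11:6 are the ladder file addresses mentioned above):

```python
def median_of_ten(samples):
    """Median of exactly ten values: average the two middle sorted elements.

    s[4] and s[5] correspond to N11:5 and N11:6 in the sorted ladder file.
    """
    s = sorted(samples)
    return (s[4] + s[5]) * 0.5

samples = [100, 99, 12, 101, 100, 8, 99, 100, 101, 100]
print(median_of_ten(samples))  # 100.0: the fluff at 8 and 12 never reaches the result
```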

After reading Paul's response, and having a little more time to think it over, I've got a simpler four-rung implementation that does not use indirect addressing. Hopefully it is a little easier for a novice programmer to understand. It uses two queues: a ten-element queue to select likely good readings from the laser, and a four-element queue to compute a running average of those readings over a twenty-second period.

- The first rung provides a single-scan timing pulse every 1/2 second.
- In the second rung, the FFL instruction is used to load a ten-element stack with new data every 1/2 second. (A FAL would have worked just as well here.)
- On the third rung, once the stack is full (five seconds), it is sorted in ascending order.
- On the fourth rung, an element that is likely to be a good reading is selected from the sorted stack. I chose the second lowest for the example, but any statistically significant element could be chosen; choose the one that works best for your process. The selected element is stored into a four-element averaging stack to provide a running average over 20 seconds. This second stack uses the FFL instruction, coupled with the FFU instruction, to implement a first-in-first-out (FIFO) queue of the readings. The twenty-second running average will update every five seconds. Finally, the first stack is reset so that it will start from the beginning and load ten new data points.

Note that upon start-up the running average won't read right until the averaging FIFO queue is filled - it may be necessary to pre-initialize the queue, or to program the controls not to respond to the average until after the queue is filled. BUNKER.zip Edited by Alaric
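The two-queue scheme above can be modeled in Python for experimentation (the real thing is PLC-5 ladder with FFL/FFU; class and parameter names here are my own):

```python
from collections import deque

class TwoQueueFilter:
    """Model of the two-queue scheme: a ten-element sample stack feeds one
    "likely good" reading per batch into a four-element FIFO whose mean is
    the 20-second running average."""
    def __init__(self, batch=10, rank=2, avg_len=4):
        self.batch = batch                      # samples per 5 s batch
        self.rank = rank                        # rank=2: the second lowest
        self.samples = []                       # ten-element sample stack
        self.averager = deque(maxlen=avg_len)   # four-element FIFO (FFL/FFU)
    def add(self, reading):
        """Add one 1/2 s sample; returns the running average once per batch."""
        self.samples.append(reading)
        if len(self.samples) < self.batch:
            return None                         # batch not full yet
        good = sorted(self.samples)[self.rank - 1]  # select from sorted stack
        self.samples.clear()                    # reset stack for next batch
        self.averager.append(good)              # FIFO drops the oldest batch
        return sum(self.averager) / len(self.averager)

f = TwoQueueFilter()
result = None
for r in [100] * 9 + [8]:       # one fluff reading of 8 in the batch
    result = f.add(r)
print(result)                   # 100.0: the fluff lands at the bottom of the sort
```

As the post warns, the running average is only meaningful once the four-element FIFO has filled (four batches, 20 seconds).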

