How to detect significant change / trend in a time series

Published 2019-01-30 09:24

Question:

So I have an array of, say, 25 samples, and I want to be able to note whether the values are decreasing or increasing over that 25-sample interval (basically, the 25-sample array is my buffer, which is refilled every 1 ms or so).

Note that it is the general trend I am looking for, not the individual derivative (as I would obtain using finite differences or other numerical differentiation techniques).

Basically, I expect my data to be noisy, so there may be ups and downs even after filtering and so on. But it's the general trend of increasing or decreasing behaviour that I am looking for.

I want to evaluate the increasing/decreasing behaviour every ms to trigger an event, which is more of a user-interface event (blinking an LED), so the processing delay does not matter much as long as I can detect the general trend.

Thanks in advance!

Answer 1:

As has been pointed out already, you're not looking for the derivative. You're really looking for a "significant change" detection algorithm for a time series.

You'll certainly want a smoothing filter (and the moving average filter is fine -- see Bjorn's answer for this part).

But in addition to the smoothing filter, you will also need a decision criterion, or threshold selector, beyond which you decide whether the filtered changes are significant.

If the underlying statistics of your time series are stable (a stationary time series), then you can use a fixed statistical threshold, in the sense of standard deviations from the mean. For example, you might choose 2 standard deviations if you want a fairly strong "alarm" threshold (think of alarming only on roughly the strongest 5% of changes).
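As a minimal sketch of such a fixed 2-sigma test (the 25-sample buffer comes from the question; `significant_change` and the sample data are hypothetical):

```python
import statistics

def significant_change(buffer, num_std=2.0):
    """Flag whether the latest sample-to-sample change deviates from the
    mean change by more than num_std standard deviations."""
    # Sample-to-sample changes over the buffer
    changes = [b - a for a, b in zip(buffer, buffer[1:])]
    mean = statistics.mean(changes)
    sigma = statistics.stdev(changes)
    latest = changes[-1]
    return abs(latest - mean) > num_std * sigma

noisy_flat = [10, 10.1, 9.9, 10.0, 10.05, 9.95, 10.0, 10.1]
print(significant_change(noisy_flat))          # small wiggles stay under 2 sigma
print(significant_change(noisy_flat + [13.0])) # the large final step trips the test
```

Note that this only works when the change statistics really are stationary, as the answer says; otherwise the fixed threshold will mis-fire.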

If there is nothing in the underlying problem that suggests your time series is stable, i.e. if the series could have a trend in it, or the process generating the series can go through fundamental changes while you're monitoring it, then you'll need a dynamic, or adaptive, threshold, in the sense of signal-to-noise ratio (mu/sigma). You might then choose to flag as "meaningful" all elements that pass the signal-to-noise test.



Answer 2:

It doesn't sound to me like you want the derivative at all. It sounds like you want a low-pass filter. A low-pass filter simply removes the quickly changing data and leaves in its place the longer, slower-changing trends. The most intuitive low-pass filter is the moving average filter, where you take the average of the last n inputs, with n chosen based on the noise versus the size of the trend you are looking for. This is widely used everywhere from audio data to image processing to unemployment data (the four-week moving average unemployment figure is widely cited).
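A minimal moving-average filter as described above could look like this (the window length n=4 is a hypothetical choice matched to the noise):

```python
def moving_average(samples, n=4):
    """Average of the last n inputs at each step (shorter at the start)."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - n + 1):i + 1]
        out.append(sum(window) / len(window))
    return out

noisy = [1, 3, 1, 3, 1, 3, 1, 3]
print(moving_average(noisy))  # settles at the underlying level of 2
```

A rising or falling trend in the filtered output can then be read off directly, without the sample-to-sample noise.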

It's possible to develop more efficient/selective filters using recursive techniques, if you feel that's necessary. You can use this tutorial to create a low-pass filter. It's written for audio, but it will work on almost any data. It shows you how to write a bell filter, but a low-pass filter is easier:

http://blog.bjornroche.com/2012/08/basic-audio-eqs.html
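As a sketch of the recursive idea (a simple first-order exponential smoother, not the bell filter from the linked tutorial; the coefficient `alpha` is a hypothetical choice):

```python
def exp_lowpass(samples, alpha=0.2):
    """First-order recursive low-pass: y[n] = y[n-1] + alpha * (x[n] - y[n-1]).
    Unlike the moving average, it needs only one value of state."""
    y = samples[0]
    out = []
    for x in samples:
        y += alpha * (x - y)  # pull the state toward each new input
        out.append(y)
    return out

step = [0.0] * 5 + [1.0] * 20
print(exp_lowpass(step))  # rises gradually toward 1 after the step
```

Smaller `alpha` means heavier smoothing but more lag; that trade-off is the same "noise versus trend size" choice as picking n for the moving average.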



Answer 3:

You can use a Wiener filter if you know the signal statistics, and use it as an n-step-ahead predictor. Your trend decision can then easily be based on the Wiener filter's prediction. If the signal is not wide-sense stationary, or you think the prediction cannot be done linearly (a nonlinear/nonstationary process), then you can use an adaptive Wiener-type filter such as the LMS filter.
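A sketch of that adaptive idea, using a normalized LMS (NLMS) one-step-ahead predictor; the filter order, the step size `mu`, and the ramp test signal are all hypothetical choices, not from the answer:

```python
def lms_predict(samples, order=3, mu=0.5):
    """Predict each sample from the previous `order` samples, adapting
    the weights online from the prediction error (NLMS update)."""
    w = [0.0] * order                  # filter weights, adapted as we go
    preds = []
    for i in range(order, len(samples)):
        x = samples[i - order:i]       # the most recent `order` inputs
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        e = samples[i] - y_hat         # prediction error drives adaptation
        norm = sum(xi * xi for xi in x) + 1e-9
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, x)]
        preds.append(y_hat)
    return preds, w

samples = [0.1 * i for i in range(50)]  # a slowly rising ramp
preds, w = lms_predict(samples)
# Once adapted, the predictor tracks the ramp; a trend decision could
# compare the prediction against the current sample.
print(preds[-1], samples[-1])
```

The normalized update keeps the adaptation stable as the input level changes, which matters for exactly the nonstationary case this answer describes.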