
Particle filter learning record

2022-06-12 15:02:00 From spring to winter


1. Overview

The particle filter is a nonparametric implementation of the Bayes filter. Its starting point is to represent the posterior probability distribution by a set of particles sampled from that posterior. The advantage is that it can represent all sorts of odd distributions and can also cope with nonlinear transformations.

Parametric versus nonparametric estimation: suppose we have some distribution. A parametric filter will first decide that it is, say, a Gaussian, and then obtain its probability density function through parameters such as the mean and variance. A particle filter does not bother about the specific functional form of the distribution at all; instead it draws a large number of samples from the distribution and uses those samples to describe it. So why can the particle filter do this? The toy sketch below contrasts the two views.
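To make the contrast concrete, here is a toy sketch (my own illustration, assuming NumPy; the data and variable names are invented for this example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend this is a distribution we can only observe through samples.
data = rng.normal(loc=2.0, scale=0.5, size=10_000)

# Parametric view: commit to a model family (here a Gaussian) and
# compress the whole distribution into a few parameters.
mu, sigma = data.mean(), data.std()

# Nonparametric (particle) view: keep the samples themselves and let
# their local density describe the distribution, whatever its shape.
particles = data[:1000]
```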

[Figure: samples drawn from a one-dimensional Gaussian distribution; dense near the mean, sparse in the tails]

Monte Carlo sampling: the intuition is that if we can draw a series of samples from a target probability distribution, we can use those samples to estimate properties of that distribution. This recalls the law of large numbers, and the relationship between frequency and probability in coin tossing: toss a coin only a few times and the counts of heads and tails show no pattern, but toss it enough times and the observed frequencies get closer and closer to the true probabilities. The figure above is an example: when sampling from a one-dimensional Gaussian, samples are more likely to appear near the mean, so they are dense there, and the farther from the mean, the sparser they become. If for every sample collected we add 1 at the corresponding position and then plot the frequencies, the resulting shape will, once the sample is large enough, resemble the shape of the distribution.

Conversely, even though we do not know the specific shape of the distribution, the denser the samples are in some interval, the greater the probability of that interval, and the more likely it is to contain a peak. Figuratively speaking, the samples let us infer in reverse what the distribution looks like, as the sketch below demonstrates.
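Here is a minimal sketch of this frequency-approximates-density idea, assuming NumPy and Matplotlib (my own example, not from the original post):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

# Normalized histogram: per-bin frequency approximates the density.
plt.hist(samples, bins=100, density=True, alpha=0.5, label="sample frequency")

# True density for comparison.
x = np.linspace(-4.0, 4.0, 400)
plt.plot(x, np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi), label="true Gaussian pdf")
plt.legend()
plt.show()
```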

2. Particle filter algorithm

Workflow

In the particle filter, the particle set is written as

$\mathcal{X}_t := x_t^{[1]}, x_t^{[2]}, \dots, x_t^{[M]}$

Here each particle is sampled from the posterior distribution; every particle is a possible hypothesis of the state at time t. The idea of the particle filter is thus to approximate the posterior distribution by the particle set. Ideally, the probability that a state hypothesis is selected to join the particle set is proportional to its posterior probability:

$x_t^{[m]} \sim p(x_t \mid z_{1:t}, u_{1:t})$

Because particles near a peak of the posterior (corresponding to the true state) are easier to draw, the more particles fall in a given region, the more likely the true state lies in that region. For the standard particle filter algorithm described here, this property holds only as the number of particles tends to infinity; for finite M, particles are in effect drawn from a slightly different distribution. In practice, the difference is negligible as long as the number of particles is not too small (say, M of at least 100).

The posterior distribution at time t is described by the particle set at time t, and the posterior at time t-1 by the particle set at time t-1. Following the Bayes filter, the particle filter constructs the particle set recursively: the input is the particle set of the previous time step together with the control and the observation of the current time step; the output is the current particle set.

In pseudocode (following the classic listing in Thrun et al., Probabilistic Robotics):

     1:  Algorithm Particle_filter(X_{t-1}, u_t, z_t):
     2:      Xbar_t = X_t = ∅
     3:      for m = 1 to M do
     4:          sample x_t^[m] ~ p(x_t | u_t, x_{t-1}^[m])
     5:          w_t^[m] = p(z_t | x_t^[m])
     6:          Xbar_t = Xbar_t + <x_t^[m], w_t^[m]>
     7:      endfor
     8:      for m = 1 to M do
     9:          draw i with probability ∝ w_t^[i]
    10:          add x_t^[i] to X_t
    11:      endfor
    12:      return X_t

Let's analyze this pseudocode line by line:

1. Input: the particle set of the previous time step, the current control, and the current observation.

2. Initialize two empty particle sets: the temporary set Xbar_t and the output set X_t.

3. Process the M particles of time t-1 one by one.

4. Apply the control (including noise) to each particle, i.e. sample a new particle from the state transition distribution. All M new particles together form the prior set.

5. Compute the importance factor (i.e. the weight) of each new particle: the likelihood of obtaining the current observation given the particle's state. As mentioned later, this factor is very important.

6. Pair each particle with its importance factor and add the pair to the temporary set.

7. End of the loop.

Resampling section

8~11. Sample M times from the particle set to generate a new particle set of equal size. Which particle gets selected depends on its weight: particles with large weights are more likely to be drawn, possibly more than once. During this process the particles gradually concentrate in certain regions. When resampling ends, the new particle set forms the posterior set.

12. Return the particle set.

Notice that the particles are never sampled from the posterior distribution directly; the real trick of the algorithm lies in the resampling step. Even though the particles are not drawn from the posterior itself, after resampling the transformed particle set does obey the posterior distribution. A runnable sketch of one update follows.
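To make the twelve lines concrete, here is a minimal Python sketch of one filter update for a one-dimensional state. The Gaussian motion and measurement models, the noise levels, and all names are my own illustrative assumptions, not part of the algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(particles, u, z, motion_noise=0.1, meas_noise=0.2):
    """One update of the basic particle filter (lines 1-12 above), 1-D state."""
    M = len(particles)

    # Lines 3-7: apply the (noisy) control to every particle, i.e. sample
    # from the state transition distribution, and weight each new particle
    # by the measurement likelihood p(z | x).
    proposed = particles + u + rng.normal(0.0, motion_noise, size=M)
    weights = np.exp(-0.5 * ((z - proposed) / meas_noise) ** 2)
    weights /= weights.sum()

    # Lines 8-11: resample M particles with probability proportional to
    # their weights; heavy particles may be drawn repeatedly.
    idx = rng.choice(M, size=M, p=weights)

    # Line 12: the resampled set approximates the posterior.
    return proposed[idx]

# Usage: the true state sits at 1.0; start from a diffuse particle set.
particles = rng.uniform(-5.0, 5.0, size=1_000)
for _ in range(10):
    particles = particle_filter(particles, u=0.0, z=1.0)
print(particles.mean())  # converges near 1.0
```

With a higher-dimensional state the structure stays the same; only the motion model and the measurement likelihood change.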

A further look at resampling

Suppose f is an unknown distribution, which we call the target distribution, and g is a known distribution, which we call the proposal distribution. The question now is: how can a particle set obtained from the proposal distribution g be made to obey the target distribution f? The following figures make this easier to understand. First, a batch of particles is sampled from g, indicated by the blue vertical lines:

[Figure: the target density f, the proposal density g, and particles sampled from g shown as vertical lines]

Then compute the degree of mismatch between f and g at each particle, i.e. the weight factor w:

$w^{[m]} = \frac{f(x^{[m]})}{g(x^{[m]})}$

[Figure: the same particles, with each vertical line's height now proportional to its weight]

Then resample according to the weights. Because particles with tall vertical lines (i.e. large weights) are easier to select, the resampled particles cluster around those lines; the distribution of the particle set changes and now obeys the target distribution f.

[Figure: the particle set after resampling, concentrated where the target density f is large]
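The same transformation in a few lines of Python (my own numerical sketch; the narrow target f and broad proposal g are invented for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Proposal g: a broad Gaussian we know how to sample from.
g_samples = rng.normal(0.0, 3.0, size=50_000)

# Weight factor w = f / g, the mismatch between target and proposal;
# the target f is taken to be a narrow Gaussian at 1.0 for this demo.
w = gauss_pdf(g_samples, 1.0, 0.5) / gauss_pdf(g_samples, 0.0, 3.0)
w /= w.sum()

# Resample according to the weights: the result is distributed as f.
f_samples = g_samples[rng.choice(len(g_samples), size=10_000, p=w)]
print(f_samples.mean(), f_samples.std())  # approximately (1.0, 0.5)
```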

3. Summary

How do other people manage to understand this so thoroughly…

Have a good weekend ~

Article reposted from: https://blog.csdn.net/setella/article/details/82912604
