Active Inference

Mathematical Background

sk_victoria 2023. 8. 27. 06:18

Reference: https://www.youtube.com/watch?v=YVDAodLNRXs&list=PLZ9Bz1i4njbXfu3qHpZh5ulUM3hutcLBk&index=42 


1. Multiplication rule
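
For reference, the standard product rule, which factorizes a joint distribution into a conditional and a marginal:

    p(x, y) = p(x \mid y)\, p(y) = p(y \mid x)\, p(x)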

 

2. Marginalization rule
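
The marginalization rule recovers a marginal by summing the joint distribution over the other variable (or integrating in the continuous case):

    p(x) = \sum_y p(x, y) = \sum_y p(x \mid y)\, p(y)
    p(x) = \int p(x, y)\, dy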

 

3. Chain rule
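
The chain rule generalizes the product rule, factorizing a joint distribution over many variables into a product of conditionals:

    p(x_1, \ldots, x_n) = p(x_1)\, p(x_2 \mid x_1) \cdots p(x_n \mid x_1, \ldots, x_{n-1}) = \prod_{i=1}^{n} p(x_i \mid x_1, \ldots, x_{i-1})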

 

4. Expected value (= weighted average)
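
The expected value of a function f(x) under p(x) is its probability-weighted average:

    \mathbb{E}_{p(x)}[f(x)] = \sum_x p(x)\, f(x) \qquad \text{(discrete)}
    \mathbb{E}_{p(x)}[f(x)] = \int p(x)\, f(x)\, dx \qquad \text{(continuous)}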

 

5. KL divergence

  • Kullback-Leibler Divergence
  • Indicates how much two probability distributions diverge from each other.
  • The approximate distribution that mimics the true distribution comes first, and the true distribution of x comes second.
  • In the definition given after this list, q(x) denotes the approximate distribution and P(x) the true distribution.
  • P usually has a complex form, while q is set to some parametric model.
  • q is often chosen to be a manageable, simple form (e.g., a Gaussian distribution).

  • Because the two distributions enter in a fixed order, the KL divergence is asymmetric: in general D_KL(q ‖ P) ≠ D_KL(P ‖ q), as the numerical sketch after this list shows.

  • The KL divergence is always greater than or equal to zero, with equality only when q = P.
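
With q(x) the approximate distribution and P(x) the true one, in the order described above:

    D_{KL}(q \,\|\, P) = \sum_x q(x) \log \frac{q(x)}{P(x)} = \mathbb{E}_{q(x)}\big[\log q(x) - \log P(x)\big]

A minimal numerical sketch of the asymmetry and the non-negativity, using two hand-picked discrete distributions (the numbers are illustrative, not from the lecture):

    import numpy as np

    def kl(q, p):
        # D_KL(q || p) = sum_x q(x) * log(q(x) / p(x)), for discrete distributions
        q, p = np.asarray(q, dtype=float), np.asarray(p, dtype=float)
        return float(np.sum(q * np.log(q / p)))

    q = [0.5, 0.4, 0.1]  # approximate distribution q(x)
    p = [0.3, 0.3, 0.4]  # "true" distribution P(x)

    print(kl(q, p))  # ~0.232 -> D_KL(q || P) >= 0
    print(kl(p, q))  # ~0.315 -> a different value, so KL is asymmetric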

 

6. Taylor expansion
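
The Taylor expansion of f(x) around a point a; free-energy derivations typically truncate it at first or second order (the Laplace approximation keeps the quadratic term):

    f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}\,(x - a)^n = f(a) + f'(a)\,(x - a) + \frac{f''(a)}{2!}\,(x - a)^2 + \cdots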

 

7. Free Energy

Super helpful Reference: https://www.youtube.com/watch?v=APbreY1B5_U
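
For reference, the standard definition of variational free energy, built from the expected value and KL divergence above, with q(x) approximating the posterior P(x | y):

    F = \mathbb{E}_{q(x)}\big[\log q(x) - \log P(y, x)\big] = D_{KL}\big(q(x) \,\|\, P(x \mid y)\big) - \log P(y)

Because the KL term is non-negative, F upper-bounds the negative log evidence -log P(y) (the "surprise"), and minimizing F with respect to q drives the approximation toward the true posterior.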