Weighted Averaging
Problem Statement
Consider a weighted averaging problem as shown in Figure 1. Let \(x \in \mathbb{R}\) be a constant hidden variable to be estimated and let \(z_i\) be multiple measurements of it. Assume the measurements are independent and each \(z_i\) is perturbed by zero-mean Gaussian noise \(\mathcal{N}(0, \sigma_i^2)\).
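As a concrete illustration, here is a minimal NumPy sketch of this setup (the true value, the noise standard deviations, and the random seed are assumed values, not from the text). The optimal fusion of independent Gaussian measurements is the inverse-variance weighted average:

```python
import numpy as np

rng = np.random.default_rng(0)

x_true = 2.0                          # hidden constant x (assumed value)
sigmas = np.array([0.5, 1.0, 2.0])    # per-measurement noise std devs (assumed)
z = x_true + rng.normal(0.0, sigmas)  # noisy measurements z_i

# Inverse-variance weighting: x_hat = sum(z_i / sigma_i^2) / sum(1 / sigma_i^2)
w = 1.0 / sigmas**2
x_hat = np.sum(w * z) / np.sum(w)
var_hat = 1.0 / np.sum(w)             # variance of the fused estimate

print(x_hat, var_hat)
```

Measurements with smaller \(\sigma_i\) receive proportionally larger weight, and the variance of the fused estimate is smaller than that of any single measurement.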
Solving with Factor Graph
Weighting Matrix Properties
The measurement covariance matrix \(\mathbf{R}\), or equivalently the weighting matrix \(\mathbf{W} = \mathbf{R}^{-1}\), is symmetric positive definite. Hence, we can factor it with the Cholesky decomposition \(\mathbf{W} = \mathbf{S}_{\mathbf{W}} \mathbf{S}^T_{\mathbf{W}}\), where \(\mathbf{S}_{\mathbf{W}}\) is a real lower-triangular matrix with positive diagonal entries.
Furthermore, since \(\mathbf{W}\) is a block diagonal matrix, the factors are conditionally independent.
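The properties above can be checked numerically. A short sketch, using an assumed diagonal \(\mathbf{R}\) for independent measurements:

```python
import numpy as np

# Diagonal measurement covariance R (independent measurements; assumed values)
sigmas = np.array([0.5, 1.0, 2.0])
R = np.diag(sigmas**2)
W = np.linalg.inv(R)                  # weighting matrix W = R^{-1}

# Cholesky factor: W = S_W @ S_W.T with S_W lower triangular,
# positive diagonal (np.linalg.cholesky returns the lower factor)
S_W = np.linalg.cholesky(W)

print(np.allclose(S_W @ S_W.T, W))    # True: factorization reproduces W
print(np.allclose(S_W, np.tril(S_W))) # True: S_W is lower triangular
print(np.all(np.diag(S_W) > 0))       # True: positive diagonal entries
```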
Mapping Weighted Least Squares to Least Squares
By pre-multiplying \(\mathbf{L}\) and \(\mathbf{z}\) by \(\mathbf{S}^T_\mathbf{W}\) (a whitening transformation), we can reuse the same computational techniques as regular least squares. Denote:

\[
\bar{\mathbf{L}} = \mathbf{S}^T_\mathbf{W} \mathbf{L}, \qquad \bar{\mathbf{z}} = \mathbf{S}^T_\mathbf{W} \mathbf{z}.
\]

Then:

\[
\|\mathbf{z} - \mathbf{L}\mathbf{x}\|^2_\mathbf{W} = (\mathbf{z} - \mathbf{L}\mathbf{x})^T \mathbf{W} (\mathbf{z} - \mathbf{L}\mathbf{x}) = \|\bar{\mathbf{z}} - \bar{\mathbf{L}}\mathbf{x}\|^2.
\]

We have the ordinary least-squares normal equations:

\[
\bar{\mathbf{L}}^T \bar{\mathbf{L}} \, \hat{\mathbf{x}} = \bar{\mathbf{L}}^T \bar{\mathbf{z}}.
\]

Finally, we compute the state estimate as:

\[
\hat{\mathbf{x}} = \left(\mathbf{L}^T \mathbf{W} \mathbf{L}\right)^{-1} \mathbf{L}^T \mathbf{W} \mathbf{z}.
\]
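The mapping can be verified end to end: solving the whitened system with an ordinary least-squares routine should match the closed-form weighted least-squares solution. A sketch under assumed values (the ones-column \(\mathbf{L}\) encodes the averaging model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Averaging model: z = L x + noise, with L a column of ones (assumed setup)
sigmas = np.array([0.5, 1.0, 1.5, 2.0])
L = np.ones((len(sigmas), 1))
z = 3.0 + rng.normal(0.0, sigmas)

W = np.diag(1.0 / sigmas**2)
S_W = np.linalg.cholesky(W)           # W = S_W @ S_W.T

# Whitened system: pre-multiply by S_W^T, then solve ordinary least squares
L_bar = S_W.T @ L
z_bar = S_W.T @ z
x_ols, *_ = np.linalg.lstsq(L_bar, z_bar, rcond=None)

# Closed-form weighted least squares for comparison
x_wls = np.linalg.solve(L.T @ W @ L, L.T @ W @ z)

print(np.allclose(x_ols, x_wls))      # True: the two solutions agree
```

In practice the whitened system is usually solved by QR or Cholesky factorization rather than by forming the normal equations explicitly, which is better conditioned.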
Notes
The matrix \(\mathbf{M} = \left(\mathbf{L}^T \mathbf{W} \mathbf{L} \right)^{-1}\) is the covariance of the hidden variables. We can relate \(\mathbf{M}\) to the covariance \(\mathbf{P}\) of a Kalman filter (KF):
- The ordering of \(\mathbf{M}\) corresponds with the columns of \(\mathbf{L}\) (hidden variables).
- Different diagonal blocks correspond to the KF \(\mathbf{P}\) matrix at different points in time.
- In general, the covariance blocks will be "smaller" than the KF \(\mathbf{P}\) because the factor graph is not causal: each estimate is conditioned on all measurements, past and future. However, the last block will equal the final KF \(\mathbf{P}\) for a linear system.
- Off-diagonal blocks express correlated errors over time: while the measurement noise fed into a KF is white, the sequence of estimation errors it produces is not.