0703-bias from wrong covariance matrix estimation
Last updated
Chi-Squared Function: The chi-squared ($\chi^2$) function is defined by:

$$\chi^2 = (\mathbf{d} - \mathbf{m}(\mathbf{p}))^\mathsf{T}\, \mathbf{C}^{-1}\, (\mathbf{d} - \mathbf{m}(\mathbf{p}))$$
where:
- $\mathbf{d}$ is the data vector.
- $\mathbf{m}(\mathbf{p})$ is the model prediction vector, dependent on the parameters $\mathbf{p}$.
- $\mathbf{C}$ is the covariance matrix of the data.
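As a concrete illustration, the $\chi^2$ above can be evaluated directly with NumPy; the data, model, and covariance values below are invented placeholders:

```python
import numpy as np

def chi_squared(d, m, C):
    """Chi-squared with a full covariance matrix: r^T C^{-1} r."""
    r = d - m
    # Solve C x = r rather than forming C^{-1} explicitly (numerically safer).
    return float(r @ np.linalg.solve(C, r))

# Toy example: 3 correlated measurements (values are illustrative only).
d = np.array([1.2, 0.9, 1.1])          # data vector
m = np.array([1.0, 1.0, 1.0])          # model prediction
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.04, 0.01],
              [0.00, 0.01, 0.04]])     # covariance with off-diagonal terms

print(chi_squared(d, m, C))
```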
Derivative Calculation: To compute the gradient $\nabla_{\mathbf{p}} \chi^2$, consider the residual vector $\mathbf{r} = \mathbf{d} - \mathbf{m}(\mathbf{p})$, so that $\chi^2 = \mathbf{r}^\mathsf{T} \mathbf{C}^{-1} \mathbf{r}$. The derivative of $\chi^2$ with respect to a parameter $p_i$ is:

$$\frac{\partial \chi^2}{\partial p_i} = \frac{\partial}{\partial p_i}\left( \mathbf{r}^\mathsf{T} \mathbf{C}^{-1} \mathbf{r} \right)$$
Applying the product rule, and using the symmetry of $\mathbf{C}^{-1}$ to combine the two resulting terms, this becomes:

$$\frac{\partial \chi^2}{\partial p_i} = 2\, \mathbf{r}^\mathsf{T} \mathbf{C}^{-1} \frac{\partial \mathbf{r}}{\partial p_i}$$
Since $\mathbf{r} = \mathbf{d} - \mathbf{m}(\mathbf{p})$, it follows:

$$\frac{\partial \mathbf{r}}{\partial p_i} = -\frac{\partial \mathbf{m}}{\partial p_i}$$
Therefore, the gradient expression simplifies to:

$$\nabla_{\mathbf{p}} \chi^2 = -2\, \mathbf{J}^\mathsf{T} \mathbf{C}^{-1} \mathbf{r}$$

where $\mathbf{J}$ is the Jacobian matrix of $\mathbf{m}(\mathbf{p})$ with respect to $\mathbf{p}$, with entries $J_{ij} = \partial m_i / \partial p_j$.
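The gradient formula $-2\,\mathbf{J}^\mathsf{T}\mathbf{C}^{-1}\mathbf{r}$ can be verified numerically against finite differences; the straight-line model and covariance below are made-up for the check:

```python
import numpy as np

def model(p, x):
    """Hypothetical linear model m(p) = p0 + p1 * x."""
    return p[0] + p[1] * x

def jacobian(p, x):
    """J_ij = dm_i / dp_j for the model above (independent of p here)."""
    return np.column_stack([np.ones_like(x), x])

def chi2(p, d, x, C_inv):
    r = d - model(p, x)
    return float(r @ C_inv @ r)

def grad_chi2(p, d, x, C_inv):
    """Analytic gradient: -2 J^T C^{-1} r."""
    r = d - model(p, x)
    J = jacobian(p, x)
    return -2.0 * J.T @ C_inv @ r

# Toy correlated data (all numbers invented for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0])
d = np.array([0.1, 1.1, 1.9, 3.2])
C = 0.04 * np.eye(4) + 0.01 * np.ones((4, 4))
C_inv = np.linalg.inv(C)
p = np.array([0.0, 1.0])

# Central finite-difference check of the analytic gradient.
eps = 1e-6
num = np.array([(chi2(p + eps * np.eye(2)[i], d, x, C_inv)
                 - chi2(p - eps * np.eye(2)[i], d, x, C_inv)) / (2 * eps)
                for i in range(2)])
print(grad_chi2(p, d, x, C_inv), num)
```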
If $\mathbf{C}$ is purely diagonal, then:

$$C_{ij} = \sigma_i^2\, \delta_{ij}, \qquad (\mathbf{C}^{-1})_{ij} = \frac{\delta_{ij}}{\sigma_i^2}$$
The chi-squared function simplifies to:

$$\chi^2 = \sum_i \frac{(d_i - m_i(\mathbf{p}))^2}{\sigma_i^2}$$
The derivative with respect to $p_j$ is:

$$\frac{\partial \chi^2}{\partial p_j} = -2 \sum_i \frac{d_i - m_i(\mathbf{p})}{\sigma_i^2} \frac{\partial m_i}{\partial p_j}$$
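A quick sanity check (with invented numbers): when $\mathbf{C} = \mathrm{diag}(\sigma_i^2)$, the matrix form of $\chi^2$ and the per-point sum agree:

```python
import numpy as np

sigma = np.array([0.2, 0.1, 0.3])      # made-up per-point uncertainties
d = np.array([1.2, 0.9, 1.1])          # toy data
m = np.array([1.0, 1.0, 1.0])          # toy model prediction

# Matrix form with C = diag(sigma^2).
C = np.diag(sigma**2)
r = d - m
chi2_matrix = float(r @ np.linalg.solve(C, r))

# Per-point sum form: sum_i (d_i - m_i)^2 / sigma_i^2.
chi2_sum = float(np.sum((d - m)**2 / sigma**2))

print(chi2_matrix, chi2_sum)
```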
In summary, ignoring the off-diagonal elements of the covariance matrix when they are significant (i.e., when data points are correlated) simplifies the calculation but implicitly treats all measurements as independent. This discards structural information about the data's variability and can bias the resulting parameter estimates and their quoted uncertainties.
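The effect can be seen in a minimal sketch: fit a straight line to one realization of correlated data, once with the full covariance and once keeping only its diagonal. The model, exponential correlation kernel, and all numbers are invented for illustration; the "diagonal" parameter variances are what that approximation *claims*, not the true uncertainties.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: straight-line model with strongly correlated noise.
x = np.linspace(0.0, 1.0, 20)
J = np.column_stack([np.ones_like(x), x])   # Jacobian of m(p) = p0 + p1 x
p_true = np.array([0.5, 2.0])

# Covariance with substantial off-diagonal structure (toy exponential kernel).
C = (0.05**2 * np.eye(20)
     + 0.04**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2))

# Draw one correlated-noise realization via the Cholesky factor of C.
L = np.linalg.cholesky(C)
d = J @ p_true + L @ rng.standard_normal(20)

def gls_fit(d, J, C):
    """Minimize (d - J p)^T C^{-1} (d - J p): p = (J^T C^-1 J)^-1 J^T C^-1 d."""
    Ci = np.linalg.inv(C)
    A = J.T @ Ci @ J
    return np.linalg.solve(A, J.T @ Ci @ d), np.linalg.inv(A)

p_full, cov_full = gls_fit(d, J, C)                      # full covariance
p_diag, cov_diag = gls_fit(d, J, np.diag(np.diag(C)))    # diagonal only

print("full C :", p_full, "claimed param var:", np.diag(cov_full))
print("diag C :", p_diag, "claimed param var:", np.diag(cov_diag))
```

On a single realization the two fits generally disagree, and the diagonal approximation misstates the parameter variances even when its best-fit values happen to be close.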