Abstract: The problems of recovering the state of a power system and detecting instances of bad data have been widely studied in the literature. Nevertheless, these two operations have, for the most part, been designed and optimized in isolation. Specifically, state estimators are optimized under the minimum mean-square error criterion, which is optimal only when the source of distortion in the data is Gaussian random noise. Hence, such state estimators fail to perform optimally when the data is further contaminated by bad data, which cannot necessarily be modeled by additive Gaussian terms. Although power system state estimation has been studied extensively, the fundamental performance limits and the attendant decision rules remain unknown when the data is potentially compromised by random bad data (due to sensor failures) or structured bad data (due to cyber attacks, also referred to as false data injection attacks). This paper provides a general framework that formalizes the underlying connection between state estimation and bad data detection routines. We aim to carry out the combined tasks of detecting the presence of random and structured bad data and forming accurate estimates of the power grid's state. The paper characterizes the optimal detectors and estimators, and the gains over existing state estimators and bad data detectors are established through numerical evaluations.
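For concreteness, the following is a minimal illustrative sketch of the setting described above, assuming the linearized measurement model that is standard in the state estimation literature; the notation here is introduced only for illustration and is not taken from the paper itself.

\[
\mathbf{z} \;=\; \mathbf{H}\mathbf{x} + \mathbf{w} + \mathbf{a},
\qquad \mathbf{w} \sim \mathcal{N}(\mathbf{0}, \boldsymbol{\Sigma}),
\]
where \(\mathbf{z}\) is the measurement vector, \(\mathbf{x}\) the system state, \(\mathbf{H}\) the measurement matrix, \(\mathbf{w}\) additive Gaussian noise, and \(\mathbf{a}\) a bad data term (zero in the nominal case, unstructured for random sensor failures, and structured, e.g., aligned with the column space of \(\mathbf{H}\), for false data injection attacks). When \(\mathbf{a} = \mathbf{0}\), the classical weighted least-squares estimate \(\hat{\mathbf{x}} = (\mathbf{H}^{\top}\boldsymbol{\Sigma}^{-1}\mathbf{H})^{-1}\mathbf{H}^{\top}\boldsymbol{\Sigma}^{-1}\mathbf{z}\) is maximum-likelihood optimal under the Gaussian noise assumption; when \(\mathbf{a} \neq \mathbf{0}\), that optimality no longer holds, which is the gap a joint detection and estimation framework is intended to close.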