Description
Abstract: Score-based diffusion models represent rich image priors, but using them to solve inverse problems in imaging poses challenges. In this talk, I will address two challenges: (1) the intractability of exact posterior sampling with a score-based prior, and (2) the fact that diffusion models often violate physical (e.g., PDE) constraints inherent in the training data. For (1), we propose using the exact log-probability function of a score-based diffusion model as the regularizer in variational inference. We apply this method to black-hole imaging and re-imagine the M87* black hole under different assumptions. For (2), we propose neural approximate mirror maps for constrained diffusion models. By learning an approximate mirror map, we can train a diffusion model in an unconstrained (mirror) space and obtain samples that satisfy the constraint by applying the learned inverse mirror map. This approach accommodates general constraints and generative models. We demonstrate its use in solving constrained inverse problems, such as data assimilation with PDE constraints.
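To make the first idea concrete, here is a minimal sketch (not the speaker's implementation) of variational inference in which a diffusion prior's log-probability regularizes a Gaussian variational posterior. Everything here is a simplifying assumption: the measurement operator A, the noise level, the toy dimensions, and especially diffusion_log_prob, which stands in for the exact log-likelihood of a pretrained score-based model (computed in practice via the probability-flow ODE) but is replaced by a closed-form Gaussian so the snippet runs on its own.

```python
import torch

torch.manual_seed(0)
D = 16                                   # flattened image dimension (toy size)
A = torch.randn(8, D) / D ** 0.5         # hypothetical linear measurement operator
x_true = torch.randn(D)
y = A @ x_true + 0.05 * torch.randn(8)   # simulated measurements

def diffusion_log_prob(x):
    # Stand-in for log p_theta(x) of a score-based diffusion model; the real
    # method evaluates this exactly with the probability-flow ODE.
    return -0.5 * (x ** 2).sum(dim=-1)

def log_likelihood(x):
    # Gaussian measurement model y = A x + noise.
    resid = y - x @ A.T
    return -0.5 * (resid ** 2).sum(dim=-1) / 0.05 ** 2

# Variational posterior q_phi(x) = N(mu, diag(exp(log_std)^2))
mu = torch.zeros(D, requires_grad=True)
log_std = torch.zeros(D, requires_grad=True)
opt = torch.optim.Adam([mu, log_std], lr=1e-2)

for step in range(500):
    eps = torch.randn(64, D)
    x = mu + eps * log_std.exp()          # reparameterized samples from q_phi
    entropy = log_std.sum()               # Gaussian entropy, up to a constant
    elbo = (log_likelihood(x) + diffusion_log_prob(x)).mean() + entropy
    opt.zero_grad()
    (-elbo).backward()                    # maximize the ELBO
    opt.step()
```

The optimized q_phi then provides approximate posterior samples whose quality reflects both the data fit and the diffusion prior.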
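The second idea can likewise be sketched in a few lines, again as an illustrative assumption rather than the speaker's code: a forward map phi sends constrained data to an unconstrained mirror space and an inverse map phi_inv sends points back, trained with a cycle-consistency term plus a penalty that keeps phi_inv outputs feasible. The unit-sum "flux" constraint, the network sizes, and the perturbation scheme are all hypothetical.

```python
import torch

D = 16
mlp = lambda: torch.nn.Sequential(
    torch.nn.Linear(D, 64), torch.nn.SiLU(), torch.nn.Linear(64, D)
)
phi, phi_inv = mlp(), mlp()               # approximate mirror map and its inverse
opt = torch.optim.Adam([*phi.parameters(), *phi_inv.parameters()], lr=1e-3)

def constraint_violation(x):
    # Hypothetical linear constraint: pixel values sum to 1.
    return (x.sum(dim=-1) - 1.0) ** 2

for step in range(1000):
    x = torch.softmax(torch.randn(128, D), dim=-1)   # toy constraint-satisfying data
    z = phi(x)
    recon = ((phi_inv(z) - x) ** 2).sum(dim=-1)      # cycle consistency
    z_pert = z + torch.randn_like(z)                 # cover the mirror space
    penalty = constraint_violation(phi_inv(z_pert))  # inverse map stays feasible
    loss = (recon + penalty).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# A diffusion model would then be trained on z = phi(x) in the unconstrained
# mirror space; applying phi_inv to its samples yields (approximately)
# constraint-satisfying images.
```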