• On the theory side, we accelerate and automate symbolic calculations using AI and high-performance computing. For instance, this approach has enabled us to uncover a wide range of novel neutrino physics models.
  • In experiments, we design statistical techniques that embed physics knowledge to maximize the information we extract from data. For instance, neural simulation-based inference lets us handle the negative quantum interference effects that plague Higgs physics (see the sketch after this list). Deploying such powerful algorithms also requires ensuring their robustness and reliability, so we develop practical uncertainty quantification methods to make them trustworthy.
  • On the core statistics side, we explore new frequentist tests that incorporate additional physics knowledge in ways the likelihood ratio test cannot, with the potential to transform both measurements and searches for new phenomena across experiments.
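
To make the second bullet concrete, here is a minimal sketch of the likelihood-ratio trick that underlies neural simulation-based inference, run on a toy one-dimensional observable with no interference. The toy simulator, its parameters, and the small network are illustrative assumptions, not our analysis code; the same idea carries over to hypotheses whose densities include negative interference terms, because the classifier learns the full density ratio rather than a non-negative signal template.

```python
# Minimal sketch of the likelihood-ratio trick behind neural
# simulation-based inference, on a toy 1-D observable. The simulator,
# its numbers, and the network are illustrative assumptions only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def simulate(mu, n):
    """Toy simulator: Gaussian 'signal' bump over an exponential
    'background'; mu is the signal strength to be inferred."""
    n_sig = rng.binomial(n, mu / (1.0 + mu))
    sig = rng.normal(125.0, 2.0, size=n_sig)            # signal-like events
    bkg = 80.0 + rng.exponential(50.0, size=n - n_sig)  # background-like events
    return np.concatenate([sig, bkg]).reshape(-1, 1)

# Train a classifier to separate samples drawn from two hypotheses.
x1 = simulate(mu=1.0, n=50_000)  # "numerator" hypothesis
x0 = simulate(mu=0.0, n=50_000)  # reference hypothesis
X = np.vstack([x1, x0])
y = np.concatenate([np.ones(len(x1)), np.zeros(len(x0))])
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=300).fit(X, y)

def log_ratio(x):
    # Likelihood-ratio trick: an optimal classifier score s(x) satisfies
    # p1(x) / p0(x) = s(x) / (1 - s(x)).
    s = clf.predict_proba(x)[:, 1].clip(1e-6, 1.0 - 1e-6)
    return np.log(s / (1.0 - s))

# Per-event log-ratios summed over an "observed" toy dataset give a
# test statistic comparing the two hypotheses.
x_obs = simulate(mu=1.0, n=1_000)
print("total log-likelihood ratio:", log_ratio(x_obs).sum())
```

A full analysis would replace the toy simulator with detector-level Monte Carlo and calibrate the learned ratio, but the estimation principle is the one shown here.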

In addition to designing methods, we work closely with experimental collaborations (e.g., at the LHC and DUNE) to deploy them on real data, and with national laboratories to scale them on high-performance computing clusters.