Resume (Oct 2022), CV (Mar 2023), mjragone (at) ucdavis.edu, Twitter, LinkedIn, Publications (scroll to bottom)

My name is Michael Ragone, and I’m a fifth-year math PhD student at UC Davis, advised by Bruno Nachtergaele. I work at the intersection of quantum many-body physics, quantum information theory, and quantum computation. My dissertation project lands in the mathematics of quantum phases: these are the quantum extensions of the solids, liquids, and gases from elementary school, and they’re fascinating! At the moment I’m particularly interested in probing topological phases of matter, exotic phases that could, in principle, let us build quantum computers with built-in error correction.

More generally, we’re interested in a class of models known as “quantum spin systems,” a mathematical framework arising from condensed matter physics and quantum information theory that describes a variety of systems, for example ions in a crystal lattice or electrons in a semiconductor. The math is rich and vibrant, pulling from varied fields like functional analysis, representation theory, and a whole lotta linear algebra. I’m generally fond of math inspired by the natural world: the universe asks some pretty great questions!

I am also working alongside Isaac Kim to study the relationship between the so-called modular commutator and quantum mutual information. There are some tantalizingly suggestive numerics hinting at a connection between the two, but an exact relationship remains to be established.

Los Alamos National Laboratory, T-Division

In Summer 2022, I attended the Quantum Computing Summer School. Besides learning a ton about quantum computing from staff scientists, postdocs, and fellow students alike, I worked with two major teams, and I continue to work with the first:

  • The geometric quantum machine learning (GQML) team, consisting (in no particular order) of Marco Cerezo, Patrick Coles, Frederic Sauvage, Martin Larocca, Quynh Nguyen, Louis Schatzki, Paolo Braccia, and myself. GQML seeks to leverage powerful tools from the representation theory of continuous groups to construct quantum machine learning paradigms with inductive biases, which in turn promise to alleviate central challenges like barren plateaus, poor local minima, and sample complexity. We wrote a series of papers: a theory paper establishing a cohesive framework for equivariant neural networks, which morally imports classical convolutional neural networks to the quantum domain; a numerics paper demonstrating the advantage of inductive biases for quantum phase classification tasks; and an expository paper for scientists to learn representation theory as it relates to QML.
    • The symmetry program has a lot to say about GQML, and there’s a lot of low-hanging fruit. Send us an email if you want to go picking with us!
  • The mixed classical-quantum simulation team, consisting of Andrew Sornborger, Javier Gonzalez Conde, Joe Gibbs, and myself. We are working towards a paradigm for simulating mixed classical-quantum systems in the noisy intermediate-scale quantum (NISQ) era. Existing approaches are either fully quantum and demand unrealistic resources (so not NISQ-friendly), or evolve quantum systems as perturbations of classical models without “backreaction,” which becomes increasingly important as the quantum system grows in size. This ongoing project consists primarily of analytic work accompanied by MATLAB simulations.


Publications

Curious Dimerization in a Class of SO(n)-Invariant Matrix Product States, B. Nachtergaele, MR (in preparation)

A Theory for Equivariant Quantum Neural Networks, Q. Nguyen, L. Schatzki, P. Braccia, MR, F. Sauvage, P. Coles, M. Larocca, M. Cerezo (arXiv:2210.08566)

Representation Theory for Geometric Quantum Machine Learning, MR, P. Braccia, Q. Nguyen, L. Schatzki, P. Coles, F. Sauvage, M. Larocca, M. Cerezo (arXiv:2210.07980)

The Power of Quantum Convolutional Neural Networks, P. Braccia, F. Sauvage, Q. Nguyen, L. Schatzki, MR, P. Coles, M. Larocca, M. Cerezo (in preparation)

Selected Talk Slides

2023 NC State Quantum Workshop on Quantum Machine Learning: Representation Theory for Geometric Quantum Machine Learning (slides)

Old Stuff: Computational Neuroscience

For a good chunk of my college career at the University of Arizona, I researched computational neuroscience in the Computational and Experimental Neuroscience Lab (CENL) under Dr. Jean-Marc Fellous. We developed a biophysical model of the rat’s spatial navigation system, and we collaborated with the Laboratory for Information Processing Systems (LIPS) to investigate coding theoretic properties of place cell networks and sharp-wave ripple events. Here’s our abstract from SFN 2016. There’s lots of other interesting work happening in both labs—check them out!

Old Stuff: Engineering Senior Design

My engineering senior design team at the University of Arizona created a machine-learning audio denoiser for General Dynamics, built for the U.S. Coast Guard. You see, the Coast Guard regularly receives distress calls from boats that are…well, distressed. These radio signals are often corrupted by atmospheric interference, so Coast Guard operators manually filter them until they are listenable. We were tasked with finding a better solution using machine learning. So, we created a framework wherein noisy audio signals are processed, converted into a representation that highlights vocal features, and fed to a specially trained autoencoder. Our results show promise, and I suspect that, following some refinement at General Dynamics, the Coast Guard may have a powerful new tool for incoming calls.
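The pipeline above can be sketched in a few lines. This is a minimal illustrative toy, not the team’s actual code: the framing parameters, feature choice (log-magnitude spectra), and the tiny untrained autoencoder are all my own stand-ins for the real, trained system.

```python
import numpy as np

def frame_signal(x, frame_len=256, hop=128):
    """Slice a 1-D audio signal into overlapping frames."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def spectral_features(frames):
    """Log-magnitude spectrum per frame (a representation that highlights vocal structure)."""
    mags = np.abs(np.fft.rfft(frames, axis=1))
    return np.log1p(mags)

class TinyAutoencoder:
    """One-hidden-layer encoder/decoder; randomly initialized for illustration only."""
    def __init__(self, dim_in, dim_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W_enc = rng.normal(scale=0.1, size=(dim_in, dim_hidden))
        self.W_dec = rng.normal(scale=0.1, size=(dim_hidden, dim_in))

    def forward(self, features):
        hidden = np.tanh(features @ self.W_enc)  # compress to a bottleneck
        return hidden @ self.W_dec               # reconstruct (the "denoised" estimate)

# A noisy sine wave stands in for a distress call.
t = np.linspace(0, 1, 4096)
noisy = np.sin(2 * np.pi * 440 * t) + 0.3 * np.random.default_rng(1).normal(size=t.shape)
feats = spectral_features(frame_signal(noisy))
recon = TinyAutoencoder(feats.shape[1], 32).forward(feats)
```

In the real project the autoencoder was trained on pairs of clean and noise-corrupted speech so that the reconstruction step actually suppresses interference; here the weights are random, so only the data flow is shown.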

Random stuff

I’m also a huge coffee and food nerd. I’ve scattered pictures of stuff I’ve made around the website—if you have coffee/food/music suggestions, I’d love to hear them!
