At DeepPsi we develop frontier AI for quantum science.

Built around the group of Prof. Liang Fu at MIT, we aim to transform quantum physics and materials science with a new generation of AI.

Check out our public code repository
Read the documentation
Our latest publications

Predicting magnetism with first-principles AI

Max Geier, Liang Fu

Computational discovery of magnetic materials remains challenging because magnetism arises from the competition between kinetic energy and Coulomb interaction that is often beyond the reach of standard electronic-structure methods. Here we tackle this challenge by directly solving the many-electron Schrödinger equation with neural-network variational Monte Carlo, which provides a highly expressive variational wavefunction for strongly correlated systems. Applying this technique to transition metal dichalcogenide moiré semiconductors, we predict itinerant ferromagnetism in WSe2/WS2 and an antiferromagnetic insulator in a twisted Γ-valley homobilayer, using the same neural network without any physics input beyond the microscopic Hamiltonian. Crucially, both types of magnetic states are obtained from a single calculation within the Sz=0 sector, removing the need to compute and compare multiple Sz sectors. This significantly reduces computational cost and paves the way for faster and more reliable magnetic material design.
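
The variational Monte Carlo loop underlying this approach is not spelled out in the abstract. As a rough, self-contained illustration of the principle (not the paper's neural-network ansatz), the sketch below optimizes a one-parameter Gaussian trial wavefunction for a 1D harmonic oscillator by Metropolis sampling and stochastic gradient descent on the energy; the names and hyperparameters here are hypothetical.

```python
# Toy variational Monte Carlo: 1D harmonic oscillator with trial psi(x) = exp(-a*x^2).
# Illustrative only; the work above replaces this one-parameter ansatz with a
# neural-network wavefunction and a many-electron moiré Hamiltonian.
import numpy as np

rng = np.random.default_rng(0)

def log_psi(x, a):
    return -a * x**2

def local_energy(x, a):
    # E_loc = (H psi) / psi for H = -1/2 d^2/dx^2 + 1/2 x^2
    return a - 2.0 * a**2 * x**2 + 0.5 * x**2

def metropolis_sample(a, n_steps=4000, width=1.0):
    x, samples = 0.0, []
    for _ in range(n_steps):
        x_new = x + width * rng.normal()
        # Accept with probability |psi(x_new)|^2 / |psi(x)|^2
        if rng.random() < np.exp(2.0 * (log_psi(x_new, a) - log_psi(x, a))):
            x = x_new
        samples.append(x)
    return np.array(samples[n_steps // 10:])      # discard burn-in

a, lr = 0.3, 0.3
for it in range(30):
    xs = metropolis_sample(a)
    e_loc = local_energy(xs, a)
    dlog = -xs**2                                 # d log(psi) / d a
    grad = 2.0 * np.mean((e_loc - e_loc.mean()) * dlog)   # stochastic dE/da
    a -= lr * grad
    print(f"iter {it:2d}  a = {a:.4f}  <E> = {e_loc.mean():.4f}")
# a approaches 0.5 and <E> approaches 0.5, the exact ground state.
```

In the setting of the paper, the single parameter is replaced by the weights of a neural-network wavefunction and the toy Hamiltonian by the full many-electron problem, but the same sample-estimate-update structure drives the energy minimization.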

First-Principles AI finds crystallization of fractional quantum Hall liquids

Ahmed Abouelkomsan, Liang Fu

When does a fractional quantum Hall (FQH) liquid crystallize? Addressing this question requires a framework that treats fractionalization and crystallization on an equal footing, especially in the strong Landau-level mixing regime. Here, we introduce MagNet, a self-attention neural-network variational wavefunction designed for quantum systems in magnetic fields in the torus geometry. We show that MagNet provides a unifying and expressive ansatz capable of describing both FQH states and electron crystals within the same architecture. Trained solely by energy minimization of the microscopic Hamiltonian, MagNet discovers topological liquid and electron crystal ground states across a broad range of Landau-level mixing. Our results highlight the power of first-principles AI for solving strongly interacting many-body problems and finding competing phases without external training data or prior physics knowledge.
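
MagNet's architecture is not described in detail here, but the abstract's central ingredient, self-attention over electrons, has a property worth making concrete: an attention layer is permutation-equivariant, so relabeling the electrons simply permutes the output features, which is the symmetry structure a many-electron ansatz needs before antisymmetrization. The minimal NumPy sketch below (hypothetical weights and sizes, not the MagNet code) checks this numerically for a single attention layer.

```python
# Single-head self-attention over per-electron features, to illustrate the
# permutation equivariance of attention; not the MagNet architecture itself.
import numpy as np

rng = np.random.default_rng(1)
n_elec, d = 6, 8                                   # electrons, feature dimension
Wq = rng.normal(size=(d, d)) / np.sqrt(d)          # hypothetical query weights
Wk = rng.normal(size=(d, d)) / np.sqrt(d)          # hypothetical key weights
Wv = rng.normal(size=(d, d)) / np.sqrt(d)          # hypothetical value weights

def attention(h):
    """h: (n_elec, d) per-electron features -> (n_elec, d) updated features."""
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / np.sqrt(d)                  # (n_elec, n_elec)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over electrons
    return h + weights @ v                         # residual update

h = rng.normal(size=(n_elec, d))                   # e.g. embedded electron coordinates
perm = rng.permutation(n_elec)

# Permuting the electrons permutes the outputs in exactly the same way.
print(np.allclose(attention(h[perm]), attention(h)[perm]))   # True
```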

Fermi Sets: Universal and interpretable neural architectures for fermions

Liang Fu

We introduce Fermi Sets, a universal and physically interpretable neural architecture for fermionic many-body wavefunctions. Building on a "parity-graded" representation [1], we prove that any continuous fermionic wavefunction on a compact domain can be approximated to arbitrary accuracy by a linear combination of K antisymmetric basis functions (such as pairwise products or Slater determinants) multiplied by symmetric functions. A key result is that the number of required bases is provably small: K=1 suffices in one-dimensional continua (and on lattices in any dimension), K=2 suffices in two dimensions, and in higher dimensions K grows at most linearly with particle number. The antisymmetric bases can be learned by small neural networks, while the symmetric factors are implemented by permutation-invariant networks whose width scales only linearly with particle number. Thus, Fermi Sets achieve universal approximation of fermionic wavefunctions with minimal overhead while retaining clear physical interpretability.
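
The structural claim above, a linear combination of antisymmetric basis functions multiplied by symmetric functions, can be made concrete with a toy K=1 example. The sketch below (hypothetical orbitals and pooling weights, not the trained Fermi Sets model) multiplies a Slater determinant of simple 1D orbitals by a Deep Sets-style permutation-invariant factor and checks that the product changes sign when two particles are exchanged.

```python
# Toy K = 1 "antisymmetric basis times symmetric factor" wavefunction in 1D,
# illustrating the decomposition described above; not the trained Fermi Sets model.
import numpy as np

rng = np.random.default_rng(2)
n = 4                                        # number of fermions (1D coordinates)

def slater(x):
    """Antisymmetric basis: Slater determinant of orbitals x^j * exp(-x^2 / 2)."""
    orbitals = np.stack([x**j * np.exp(-x**2 / 2) for j in range(n)], axis=1)
    return np.linalg.det(orbitals)           # rows = particles, columns = orbitals

W1 = rng.normal(size=(1, 8))                 # hypothetical embedding weights
W2 = rng.normal(size=8)                      # hypothetical readout weights

def symmetric_factor(x):
    """Permutation-invariant factor: sum-pooled per-particle features (Deep Sets style)."""
    features = np.tanh(x[:, None] @ W1)      # (n, 8) per-particle embeddings
    pooled = features.sum(axis=0)            # invariant under particle exchange
    return np.exp(pooled @ W2)               # positive symmetric prefactor

def psi(x):
    return symmetric_factor(x) * slater(x)   # antisymmetric overall

x = rng.normal(size=n)
x_swapped = x.copy()
x_swapped[[0, 1]] = x_swapped[[1, 0]]        # exchange particles 0 and 1
print(np.isclose(psi(x_swapped), -psi(x)))   # True: the sign flips under exchange
```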
