AIchemy’s Monthly Webinar Series – Oct 25


KEY DETAILS

  • DATE: 22 October 2025
  • TIME: 15:00 – 16:00
  • COST: Free

This month’s talks:

Prof. Bingqing Chen

Standard machine-learning interatomic potentials (MLIPs) often rely on short-range approximations, which limits their applicability to systems with significant electrostatics. We recently introduced the Latent Ewald Summation (LES) method, which learns long-range electrostatics from just energy and force data. We show that LES can effectively infer physical partial charges, polarization, and Born effective charge (BEC) tensors, and achieves better accuracy than methods that explicitly learn charges. As demonstrations, we predict the infrared spectra of bulk water under zero or finite external electric fields, the ionic conductivity of high-pressure superionic ice, and the phase transition and hysteresis in the ferroelectric perovskite PbTiO₃.


Yuxing Zhou – University of Oxford

Talk Title: Device-scale simulations of memory materials enabled by fast and accurate interatomic potentials

Atomistic simulations play an important role in understanding the fundamental properties and working mechanisms of devices based on phase-change materials (PCMs). Our recent work has shown that machine-learning (ML)-driven molecular dynamics simulations enable an accurate description of Ge–Sb–Te alloys, particularly compounds on the GeTe–Sb₂Te₃ tie-line (GST) [1]. Using an ML potential based on the Gaussian approximation potential (GAP) framework, we demonstrated a RESET (“1→0”) simulation over 50 ps in a device-scale model of 532,980 atoms, corresponding to a real device size of 40 × 20 × 20 nm³. However, realistic switching operations in GST devices usually take tens of nanoseconds. More importantly, non-isothermal conditions are prominent in GST devices and can lead to SET or RESET states distinct from those obtained under isothermal conditions, complicating accurate modelling of phase transitions in real devices.

In this talk, I will demonstrate full-cycle device-scale simulations of GST devices under realistic programming conditions. I will introduce a new ML potential based on the Atomic Cluster Expansion (ACE) framework [2]. The new ACE potential is more than 400 times faster than the GAP potential, enabling full-loop simulations (multiple RESET-to-SET operations) of cross-point and mushroom-type devices at large length scales (sub-million atoms) and time scales (tens of nanoseconds). Next, I will present a new simulation protocol that describes non-isothermal conditions and temperature gradients of any desired level of spatiotemporal complexity. Based on these ML-driven MD simulations, we show the temperature-dependent crystallisation behaviour of GST, elucidating the interplay between nucleation and growth during non-isothermal crystallisation in GST memory devices. The talk presents a platform for the predictive modelling of PCM-based memory devices and, more widely, highlights the power of highly scalable atomistic machine-learning models for modern materials science and engineering.

Speakers

Prof. Bingqing Chen

Professor of Chemistry

Yuxing Zhou

PhD student

Dr. Tahereh Nematiaram

Webinar Host – Associate Professor