BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//aichemy - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:aichemy
X-ORIGINAL-URL:https://aichemy.ac.uk
X-WR-CALDESC:Events for aichemy
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20240101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20251210T140000
DTEND;TZID=UTC:20251210T150000
DTSTAMP:20260430T211347Z
CREATED:20251125T132616Z
LAST-MODIFIED:20260325T135926Z
UID:6325-1765375200-1765378800@aichemy.ac.uk
SUMMARY:AIchemy’s Monthly Webinar Series - December 2025
DESCRIPTION:KEY DETAILS\n\n
 DATE: 10 December 2025\nTIME: 14:00 – 15:00\nCOST: Free\n\n
 RECORDINGS: Click the YouTube links below to watch each session.\n\n
 Large-scale Crystal Structure Prediction: Learning from 1\,000 molecules and beyond\n
 Retention Is All You Get (But Maybe It’s All You Need)\n\n
 We are delighted to welcome you to our AIchemy Hub’s monthly webinar series.\n\n
 This month’s talks:\n\n
 Prof. Keith Butler – University College London\n\n
 Talk Title: Retention Is All You Get (But Maybe It’s All You Need): Using Large Language Models to Design and Discover New Materials\n\n
 Large language models (LLMs) have transformed how we work with text\, but their underlying mechanism\, autoregressive next-token prediction\, naturally extends to any domain that can be expressed as a sequence. In this webinar\, Keith will explore how this paradigm can be repurposed for chemistry and materials science by treating crystal structures as a “language” and training LLMs to generate them.\n\n
 He will discuss his recent work developing CrystaLLM\, an autoregressive model trained on large collections of crystallographic data. The model learns the statistical grammar of known materials well enough to generate syntactically valid and chemically plausible crystal structures. However\, detailed interrogation shows that the model’s apparent creativity is predominantly driven by retention\, recombining motifs seen in its training data rather than building a genuine\, generalisable “world model” of chemistry. This distinction is important for how such models are interpreted and deployed in discovery workflows.\n\n
 Keith will then introduce his team’s latest extensions using conditional generation\, which allow them to steer the model with property targets or experimental measurements. This approach does not magically endow the model with chemical reasoning\, but it provides a powerful way to exploit its learned structural priors. He will illustrate this with examples such as conditioning on X-ray diffraction patterns to accelerate structure solution and conditioning on target optoelectronic properties to bias generation toward functional materials spaces.\n\n
 Overall\, the aim of Keith’s talk is to provide a realistic\, scientifically grounded view of what LLMs can and cannot do for chemical discovery. These models are powerful tools for pattern learning and hypothesis generation\, but they do not yet constitute autonomous scientific reasoners. Understanding this helps researchers design workflows where they offer genuine advantage without overstating their capabilities.\n\n
 Chris Taylor – University of Southampton\n\n
 Talk Title: Large-scale Crystal Structure Prediction: Learning from 1\,000 molecules and beyond\n\n
 Computational molecular crystal structure prediction (CSP) is a mature and powerful tool in materials discovery\, able to successfully predict and rank the possible crystal polymorphs of a range of functional materials at increasingly large scale. In this talk\, I describe our landmark study carrying out thorough CSP explorations on over 1\,000 rigid molecules with experimentally known forms\, demonstrating our CSP workflow’s overwhelming success in predicting and ranking known forms\, and in rationalising empirical crystal engineering rules. I also demonstrate the potential of such large-scale data generation by presenting a machine-learned energy correction and a message-passing (MACE) neural network potential trained on this data\, as examples of the possibilities for employing AI trained on such datasets to empower functional materials discovery.\n\n
 Speakers\n\n
 Prof. Keith Butler – Associate Professor in Computational Materials Chemistry\n
 Dr. Chris Taylor – Postdoctoral Research Fellow\n
 John Ward – Webinar Host
URL:https://aichemy.ac.uk/event/aichemys-monthly-webinar-series-december-2025/
CATEGORIES:Webinar
ATTACH;FMTTYPE=image/png:https://aichemy.ac.uk/wp-content/uploads/2025/11/Dec-25-Webinar-ft-image.png
END:VEVENT
END:VCALENDAR