
Project 1: Metamaterials designed by AI for Sustainable Steel

Staff: Corentin Coulais, Jan-Willem van de Meent
Institutes: Institute of Physics, Institute of Informatics

In this project, we will develop deep learning methods to accelerate the computational design of dissipative metamaterials that will in turn enable new technology for sustainable steel.

Computational design of metamaterials will require solving fundamental research problems at the intersection of materials science and AI research. The response of steel metamaterials is highly nonlinear, as it involves a mechanical instability called buckling, irreversible plastic deformations and self-contact interactions. Previous work by Coulais demonstrated that convolutional neural networks are extremely efficient at classifying metamaterials and can be combined with genetic algorithms to find materials with optimal properties. This project will build on this line of work, which was carried out in the static linear regime, to reason about dynamic nonlinear responses.

WP1: Learning a coarse-grained model from finite-element simulations. The technical challenge in modeling the dissipative dynamics of metamaterials is that numerical simulation with traditional finite-element methods is prohibitively expensive, even for a single configuration. To address this challenge, we will begin by developing AI methods that approximate dynamics at the level of a unit cell. The input to the model will be a large collection of small-scale finite-element simulations, which can be generated relatively cheaply. To account for memory effects in the material, we will train a neural sequence model, such as a recurrent neural network, to predict the pairwise forces between the elements of the coarse-grained model in a manner that mirrors the finite-element dynamics.
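As a sketch of what such a sequence model could look like, the toy below rolls a small hand-written recurrent cell over two strain histories that end at the same strain value. The weights are random and untrained (in the project they would be fitted to finite-element trajectories), and all dimensions and loading curves are illustrative; the point is only that the hidden state makes the predicted force history-dependent, which is the memory effect the model is meant to capture.

```python
import numpy as np

rng = np.random.default_rng(0)

# One input (strain of a coarse-grained bond), a small hidden state.
n_in, n_hid, n_out = 1, 8, 1

# Random, untrained weights; in the project these would be fitted to
# finite-element trajectories.
Wx = rng.normal(scale=0.5, size=(n_hid, n_in))
Wh = rng.normal(scale=0.5, size=(n_hid, n_hid))
Wo = rng.normal(scale=0.5, size=(n_out, n_hid))

def predict_forces(strain_seq):
    """Roll a recurrent cell over a strain history; the hidden state
    carries a memory of past deformation, mimicking plasticity."""
    h = np.zeros(n_hid)
    forces = []
    for eps in strain_seq:
        h = np.tanh(Wx @ np.array([eps]) + Wh @ h)
        forces.append(float(Wo @ h))
    return forces

# Two loading histories that end at the same strain value ...
overloaded = [0.0, 0.2, 0.4, 0.6, 0.4]  # loaded past 0.4, then unloaded
monotonic  = [0.0, 0.1, 0.2, 0.3, 0.4]  # loaded straight up to 0.4

f1 = predict_forces(overloaded)
f2 = predict_forces(monotonic)

# ... yet the predicted final forces differ: the model is history-dependent.
print(f1[-1] != f2[-1])  # True
```

A feed-forward model mapping instantaneous strain to force could never distinguish these two histories; the recurrence is what buys the memory.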

WP2: Learning a surrogate function for the coarse-grained dynamics. A coarse-grained model will reduce O(50k) finite elements into O(10) coarse-grained particles per unit cell. This will make it possible to simulate dissipative dynamics for multiple configurations. However, it will still not be feasible to test all configurations that need to be considered during optimization. To screen candidate configurations, we will train a fast surrogate model to predict macroscopic response variables from coarse-grained simulations. This model will take the form of an equivariant graph neural network, i.e., one whose predictions transform consistently under translations and rotations of the inputs. These equivariant architectures, which have in part been pioneered at AMLab, will allow us to train a surrogate model from a comparatively small number of coarse-grained simulations by generalizing across inputs that are equivalent up to rotations and translations.
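The simplest ingredient of such symmetry-aware models is to build network inputs from pairwise distances, which are unchanged by rigid motions; full equivariant architectures additionally handle vector-valued outputs, which this toy does not attempt. The check below (illustrative, not one of the AMLab architectures) verifies that distance-based edge features for a random configuration of coarse-grained particles are identical before and after a random rotation and translation.

```python
import numpy as np

rng = np.random.default_rng(1)

def edge_features(positions, edges):
    """Pairwise distances between coarse-grained particles. A network fed
    only such features is automatically invariant under rotations and
    translations of the whole configuration."""
    return np.array([np.linalg.norm(positions[i] - positions[j])
                     for i, j in edges])

positions = rng.normal(size=(10, 3))                  # 10 CG particles
edges = [(i, j) for i in range(10) for j in range(i + 1, 10)]

f_orig = edge_features(positions, edges)

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))          # random orthogonal map
t = rng.normal(size=3)                                # random translation
f_moved = edge_features(positions @ Q.T + t, edges)

print(np.allclose(f_orig, f_moved))  # True: the features do not change
```

Baking the symmetry into the features (or the architecture) is what lets a surrogate generalize from few simulations: it never has to learn separately about configurations that are the same shape in a different pose.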

WP3: Bayesian combinatorial optimization of metamaterial configurations. Our ultimate goal is to define an outer optimization loop that maximizes shock-damping performance subject to a carbon footprint budget, or conversely minimizes the carbon footprint subject to damping performance requirements. This involves repeatedly running coarse-grained simulations in a manner that balances exploration (testing new configurations with uncertain properties) and exploitation (improving on the current best candidate). To this end, we will employ the learned surrogate model to compute similarities between unseen candidate configurations and previously simulated configurations. The resulting Bayesian optimization procedure should allow us to identify an optimum based on O(100) simulations.
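The loop below sketches this procedure in one dimension, with a cheap analytic function standing in for an expensive coarse-grained simulation. A small hand-rolled Gaussian-process posterior scores candidate designs by expected improvement, which trades off exploitation (high predicted mean) against exploration (high predicted uncertainty). All functions, kernels, and parameters are toy stand-ins, not the project's actual similarity model.

```python
import numpy as np
from math import erf, sqrt, pi

rng = np.random.default_rng(2)

def objective(x):
    """Stand-in for an expensive simulation: damping performance as a
    function of a single design parameter in [0, 1]."""
    return -(x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

def rbf(a, b, ls=0.1):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """Gaussian-process mean and std at test points Xs given data (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    z = (mu - best) / sigma
    Phi = np.array([0.5 * (1 + erf(v / sqrt(2))) for v in z])  # normal CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)                 # normal PDF
    return (mu - best) * Phi + sigma * phi

grid = np.linspace(0, 1, 200)       # candidate configurations
X = rng.uniform(0, 1, 3)            # three initial "simulations"
y = objective(X)

for _ in range(10):                 # exploration/exploitation loop
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))

print(f"best candidate after {len(X)} simulations: x = {X[np.argmax(y)]:.2f}")
```

The same logic scales to combinatorial design spaces once the kernel (here a 1-D RBF) is replaced by a similarity measure between configurations, which is exactly the role the learned surrogate plays in WP3.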

Expected outcomes and Impact: Metamaterials are currently at an early technology readiness level, so this project carries risks that are inherent to all fundamental research. However, solving these fundamental challenges can substantially contribute to the greening of the steel industry by (i) improving vehicle efficiency through the development of stronger and lighter metamaterials that meet the energy-absorption requirements for aerospace and automotive applications, and (ii) enabling the construction of parts that employ lower-grade steel as the base material, which requires less CO2 to produce than high-grade steels and other metals.

Project 2: Salt hydrates for thermal energy storage 

Staff: Noushine Shahidzadeh, Alberto Pérez de Alba Ortíz, Sander Woutersen
Institutes: Institute of Physics, Van ’t Hoff Institute for Molecular Sciences, Institute of Informatics

Do you enjoy working in interdisciplinary research at the crossroads of physics, chemistry, and artificial intelligence (AI)? The Institute of Physics (IoP), the Van ’t Hoff Institute for Molecular Sciences (HIMS) and Institute of Informatics (IvI) have joined forces and are looking for an ambitious postdoctoral researcher to complement the team working on salt hydrates for sustainable thermal energy storage (TES).

Buildings consume about 30% of all global energy, most of it for thermal end uses. Changing the current gas- and electricity-based heat supply into a renewable-based supply requires the development of novel technologies to store energy. To meet this challenge, we are looking for a researcher to work on TES materials as a key technology for an energy system that is efficient, resilient, and affordable. We will focus on salt hydrates, which have the highest energy densities among smart thermochemical materials. Based on a reversible dehydration/hydration reaction, salt hydrates can, for example, store heat during a warm day (cooling) and release it when it is cold at night (heating). However, the sustainable deployment of these materials is still limited by a number of technical challenges, such as a broad transition temperature range, supercooling hysteresis, and degradation with cycling (i.e., irreversible phase transformations). The main scientific challenge is to determine optimal design rules that maximize the reversibility of cycling and tune the transition temperature range of salt hydrates for TES. The design space is, however, extremely large: different salt hydrates, which can also be mixed, have different crystalline structures, and the microcrystallite size of the salt is also important for the speed and efficiency of (de)hydration.

What are you going to do?
To meet this challenge, AI should be combined with experimental techniques such as calorimetry, Raman and IR spectroscopy, and X-ray techniques, as well as molecular simulations, in a feedback loop that efficiently explores the vast design space. The parameters from the experiments, together with structural and dynamical molecular fingerprints, will feed machine learning algorithms that predict and manage TES materials and their properties. Feature analysis will reveal which aspects are key to look for in optimal salt hydrates. More importantly, in the AI part, we will go beyond regression and train generative models able to propose new salt hydrates with a high predicted performance. We will evaluate the use of models that have been ground-breaking in protein engineering, drug design and small-molecule design, such as variational autoencoders, long short-term memory networks and diffusion models. Additionally, within collaborations, we will evaluate the speedup of replacing quantum chemistry calculations with machine-learned potentials and extract thermal properties from molecular dynamics simulations.
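As a minimal sketch of the regression-plus-feature-analysis step, the toy below fits ridge regression to an entirely synthetic "cycling reversibility" target and then ranks descriptors by permutation importance. The descriptor names and all numbers are placeholders invented for illustration, not measured quantities.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical descriptors for 200 candidate salt hydrates; names and
# values are synthetic placeholders, not measured data.
features = ["hydration_enthalpy", "transition_T", "crystallite_size",
            "mixing_ratio", "lattice_spacing"]
X = rng.normal(size=(200, 5))
# Synthetic target: strongly controlled by the first two descriptors,
# barely by the third, plus measurement noise.
w_true = np.array([1.5, -1.0, 0.1, 0.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=200)

# Ridge regression: w = (X^T X + lam I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)

def mse(Xm):
    return float(np.mean((Xm @ w - y) ** 2))

# Permutation importance: shuffle one descriptor column and measure how
# much the prediction error grows.
base = mse(X)
importance = {}
for k, name in enumerate(features):
    Xp = X.copy()
    Xp[:, k] = rng.permutation(Xp[:, k])
    importance[name] = mse(Xp) - base

ranked = sorted(importance, key=importance.get, reverse=True)
print(ranked[:2])  # the descriptors the model relies on most
```

On real data the same ranking step would point experimentalists at the handful of material properties worth controlling, before any generative modeling is attempted.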

Project 3: Deep representation & simplification of chemical additives for safe-and-sustainable-by-design plastics

Staff: Saer Samanipour, Antonia Praetorius, Patrick Forré
Institutes: Van ’t Hoff Institute for Molecular Sciences, Institute for Biodiversity and Ecosystem Dynamics, Institute of Informatics

Plastics play a central role in our society due to their low production costs, high versatility and malleability as materials. Their desirable properties are achieved via specific combinations of the polymer matrix and various chemical additives (plasticizers, pigments, etc.). However, many of these additives are hazardous to human and environmental health, and they can leach out of the polymer matrix and cause adverse effects. Moreover, the high complexity of additives used in plastics hampers effective re-use and recycling strategies and limits realistic pathways towards circular and sustainable plastics. In parallel to ongoing negotiations for a Global Plastic Pollution Treaty, there have therefore been clear calls for simplifying the chemical fingerprint of plastics and phasing out toxic chemicals. In this context, utilizing the approach of Safe-and-Sustainable-by-Design (SSbD) for the development of alternative plastic materials is the logical way forward, as it places the assessment of safety and sustainability of the newly designed materials (together with their desired function) in the early stages of the design process. A key step here would be the identification of a subset of additives which fulfill key functions, in a wide range of polymer types and plastic applications, while being safe with respect to human and environmental health.

Scientific challenges: There are several challenges related to the simplification of the plastic-additive chemical space to enable the production of SSbD plastics. These challenges are mainly due to the complexity of this space (10 000 chemicals have been identified as potentially used in plastic production) as well as the lack of experimental data on environmental fate and toxicity. Additionally, current models for in-silico prediction of chemical properties pertinent to hazard or environmental fate assessment are based on quantitative structure–activity relationships (QSARs), which suffer from limited applicability domains, sparse training sets, and overly restrictive assumptions. This makes them error-prone and poorly applicable to chemicals that differ strongly from the training set. Moreover, these QSAR models tackle one toxicity/fate parameter at a time, which has been shown to be inadequate for assessing the safety and sustainability of chemicals (e.g. PFAS). Recent applications of machine learning combined with Bayesian network models have shown great potential for providing a more accurate assessment of the fate and toxicity of chemicals. However, the current applications require extensive measurements of both fate/toxicity and environmental occurrence.

Objectives & approach: In this project an interdisciplinary team of (environmental) chemists and data scientists at the University of Amsterdam will tackle these challenges together with a network of external partners from academia, regulation and industry. We propose a data-driven approach towards simplifying the suite of chemical additives used in plastics to support the development of Safe-and-Sustainable-by-Design (SSbD) polymeric materials. We will develop advanced computational tools that make use of the 3D structure of chemical additives to score them according to specific SSbD criteria, while taking into account the polymer functionality. The final goal is to generate a short-list of safe chemical additives covering a range of key functions, which can be used to formulate simplified and harmonized plastics across a wide range of application sectors. Finally, these models will be applied to the chemical space outside of the known plastic additives to identify potential novel SSbD plastic additives.
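A toy version of the short-listing step might look as follows, assuming hypothetical per-additive scores for hazard, persistence, and function (all names and numbers below are invented for illustration, not real assessments). A Pareto filter keeps only additives that no other candidate beats on all SSbD criteria simultaneously:

```python
# Hypothetical records: (name, predicted_hazard, predicted_persistence,
# function_score); lower hazard/persistence and higher function are better.
additives = [
    ("additive_A", 0.20, 0.30, 0.90),
    ("additive_B", 0.80, 0.20, 0.95),
    ("additive_C", 0.10, 0.10, 0.50),
    ("additive_D", 0.30, 0.70, 0.80),
    ("additive_E", 0.15, 0.25, 0.85),
]

def dominates(a, b):
    """a dominates b if it is no worse on every criterion and strictly
    better on at least one."""
    no_worse = a[1] <= b[1] and a[2] <= b[2] and a[3] >= b[3]
    better = a[1] < b[1] or a[2] < b[2] or a[3] > b[3]
    return no_worse and better

def shortlist(candidates):
    # Keep only Pareto-optimal candidates: no other additive beats them
    # on all criteria at once.
    return [a for a in candidates
            if not any(dominates(b, a) for b in candidates)]

print([name for name, *_ in shortlist(additives)])
# ['additive_A', 'additive_B', 'additive_C', 'additive_E']
```

Here additive_D drops out because additive_A is safer, less persistent, and more functional all at once; the survivors represent genuinely different trade-offs that merit expert review. The real pipeline would feed model-predicted SSbD scores, not hand-written numbers, into the same kind of filter.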

Project 4: Machine Learning-based models of plant protein mixtures for sustainable food design

Staff: Peter Bolhuis, Herke van Hoof, Sara Jabbari-Farouji, Francesca Quattrocchio, Peter Schall, Alberto Pérez de Alba Ortíz
Institutes: Van ’t Hoff Institute for Molecular Sciences, Institute of Physics, Institute of Informatics, Swammerdam Institute for Life Sciences

Providing sustainable and healthy nutrition for the growing global population is a major challenge. Plant-based protein products can serve as sustainable alternatives to animal proteins, offering space- and CO2-saving substitutes. Emerging plant proteins such as RuBisCo, pea, and potato proteins are becoming increasingly important in the nutrition industry, and it is highly desirable to extract proteins from plant waste. However, food preparation is a complex process involving various proteins with different solubility and aggregation behavior. For example, heating causes proteins to denature and form aggregates, which imparts glass- or gel-like properties onto the protein mixture that are essential to food quality and consumer experience. Moreover, these protein mixtures can show completely unexpected properties that cannot be interpolated from those of their components.

Efficient large-scale modelling of complex protein mixtures will be tremendously important in the design of sustainable food. Atomistic simulations allow modelling of protein interactions on a small scale, but they are limited in their ability to reach experimentally relevant length scales. Combining machine learning (ML) with experimental and simulation data allows the development of transferable coarse-grained (CG) models of proteins as colloidal particles of increasing complexity, and the use of these models to predict properties of protein mixtures. The models will range from simple isotropic potentials to more complex anisotropic "patchy" particles with specific and non-specific binding patches characterized by patch size, strength, and interaction range. The effective CG models will be trained using databases, atomistic simulations, and experimental data. To address the challenge of optimizing parameters in high-dimensional spaces, ML-based approaches such as Bayesian optimization and reinforcement learning will be employed. The goal is to describe the behavior of protein mixtures accurately with as little added complexity as possible.
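To make the parameter-fitting step concrete, the sketch below fits the two parameters of the simplest member of this model hierarchy, an isotropic 12-6 pair potential, to synthetic "reference" energies generated from known parameters plus noise (in the project these would come from atomistic simulation or experiment). Because a two-parameter fit is cheap, a plain parameter scan combined with linear least squares stands in for the Bayesian-optimization or reinforcement-learning machinery that the high-dimensional patchy-particle models would actually require.

```python
import numpy as np

rng = np.random.default_rng(4)

def lj(r, eps, sig):
    """Isotropic 12-6 pair potential: the simplest rung of the CG model
    ladder (no patches, no anisotropy)."""
    x = (sig / r) ** 6
    return 4 * eps * (x * x - x)

# Illustrative "reference" energies for a protein pair, generated from
# known parameters plus noise; not measured data.
r = np.linspace(0.9, 2.5, 40)
eps_true, sig_true = 1.2, 1.0
u_ref = lj(r, eps_true, sig_true) + 0.01 * rng.normal(size=r.size)

def best_eps(sig):
    """The energy is linear in eps (u = eps * g), so for each sigma the
    optimal eps follows from one-variable least squares."""
    g = 4 * ((sig / r) ** 12 - (sig / r) ** 6)
    return float(g @ u_ref / (g @ g))

# Scan sigma on a grid, solving for the matching eps at each point.
sig_grid = np.linspace(0.5, 2.0, 301)
fits = [(float(np.mean((lj(r, best_eps(s), s) - u_ref) ** 2)),
         best_eps(s), s) for s in sig_grid]
_, eps_fit, sig_fit = min(fits)  # lowest mean-squared error wins

print(f"fitted (eps, sigma) = ({eps_fit:.2f}, {sig_fit:.2f})")
```

The recovered parameters land close to the generating values, illustrating the inner loop of model calibration; for patchy particles with many coupled parameters, this scan becomes infeasible and the Bayesian/RL optimizers mentioned above take its place.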

The final aim is to develop an AI-driven model that is able to predict the aggregation and rheological behavior of naturally occurring complex protein mixtures in plants to design plant-based food with optimal taste, nutritional value, and stable texture.

In this highly multidisciplinary postdoctoral project, you will address the following research topics:

  1. Develop a workflow to optimize the model using ML and forward modeling.
  2. Optimize models for specific proteins based on the available experimental and simulation input data.
  3. Perform large scale simulations, and investigate the sensitivity of properties to model parameters.
  4. Validate changes suggested by the model (optionally in experiments).