
ESPResSo 5.0.0 released

This is a major release. New features were added and deprecated features were removed. The API has changed, and some of these changes are silent, i.e. warnings aren’t necessarily emitted when running a script designed for ESPResSo 4.x that relies on features that have changed significantly in ESPResSo 5.0.

Highlights of the release include:

  • rewrite of lattice-Boltzmann and electrokinetics using solvers backed by the waLBerla framework. For LB, enhancements include vectorization support on the CPU, multi-GPU support, per-cell boundary conditions to build arbitrary geometries, per-particle friction coefficients, and Lees-Edwards boundary conditions.
  • faster writing of simulation trajectories using the H5MD file format, particularly in parallel simulations.
  • per-particle selection of equations of motion.
  • the thermalized Stoner–Wohlfarth model for magnetodynamics, and the ability to obtain local magnetic fields at the particles’ positions.
  • virtual sites tracking the center of mass of a group of particles for umbrella sampling.
  • initial support for shared-memory parallelism for some scenarios.
  • several new tutorials, e.g., on the sedimentation of particles in a fluid, the Boltzmann inversion technique, electrode modelling, and machine-learned inter-atomic potentials.

Please find the list of changes below. The numbers in brackets refer to ticket numbers on https://github.com/espressomd/espresso

Get the source code in the Download area.

Added functionality

  • The original LB and EK methods have been completely replaced with equivalent implementations based on the high-performance waLBerla library (#4726, #5101). This is a major API change that requires adapting all LB and EK scripts to use the new classes and arguments.
  • LB now supports Lees-Edwards boundary conditions (#4977).
  • LB and EK methods now support setting boundary slip velocities on individual nodes (#4252).
  • LB now supports a per-particle friction coefficient gamma (#4743). This currently only works for isotropic particles.
  • Thermostats and integrators have been redesigned as unified propagators (#4820, #4603). Multiple combinations of thermostats and integrators are now supported to solve multiphysics problems. While the Python interface remains mostly unchanged, internally the user-selected integrator is now the “main” integrator. Alternative integration schemes can then be enabled on a per-particle basis using the new propagation flag. An important consequence is that all virtual site types can now be enabled in the same simulation.
  • Magnetodynamics support was introduced with the thermal Stoner–Wohlfarth model (#5188). This is achieved through a virtual site that decouples the particle dipole from the particle quaternion.
  • A new virtual site implementation was introduced to exert forces on molecules through their center of mass, for example to implement umbrella sampling (#5199).
  • The OpenGL visualizer now uses different colors for arrows representing fluid velocities and slip velocities (#4252).
  • ESPResSo now supports the ZnDraw visualizer (#4967, #5115, #5217).
  • ESPResSo now has Atomic Simulation Environment (ASE) bindings (#4912), including calculators (#5162). One application is interfacing ESPResSo with machine-learned potentials.
  • The magnetostatics.DipolarDirectSumCpu() feature now works in an MPI-parallel simulation (#4559).
  • The magnetostatics.DipolarDirectSumCpu() feature now supports replicas via the new optional argument n_replicas (#4559).
  • The magnetostatics.DipolarDirectSumGpu() feature now supports replicas via the new optional argument n_replicas (#5094).
  • The magnetostatics.DipolarDirectSumCpu() and magnetostatics.DipolarDirectSumGpu() features can now calculate the total dipole field experienced by each particle (#4626, #5094). Requires feature DIPOLE_FIELD_TRACKING.
  • Particle-based observables ParticleDirectors() and ParticleDipoleFields() were added (#4627, #4626).
  • Particle bond energies can be calculated with system.analysis.particle_bond_energy() for a given bond and particle (#5040).
  • Particle neighbor lists can be extracted with system.analysis.particle_neighbor_pids() (#4662). This feature will help prototyping simulations that interface with machine-learned potentials, which take a list of particle positions as input and output the force on the central particle.
  • Observable PairwiseDistances and accumulator ContactTime were introduced to track the contact time, i.e. the number of consecutive time steps during which two particles are closer than a cutoff value (#5032).
  • The bond breakage feature now supports angle bonds (#4716).
  • Tabulated interaction TabulatedNonBonded got a new method set_analytical() to automatically set the energy and force from the analytical expression of the potential using SymPy (#5019).
  • Instrumentation tools Caliper, CUDA-GDB and kernprof are now natively supported (#4747).
  • Instrumentation feature FPE (floating-point exceptions) is now natively supported on x86 and Armv8 architectures (#5020).
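
The contact-time quantity tracked by the new ContactTime accumulator can be illustrated in plain Python. The sketch below is not ESPResSo code; it only demonstrates the definition being accumulated, i.e. how many of the most recent consecutive time steps a pair spent within the cutoff:

```python
def contact_time_steps(distances, cutoff):
    """Count the most recent consecutive time steps during which the
    pair distance stayed below the cutoff (plain-Python sketch of the
    quantity the ContactTime accumulator tracks, not ESPResSo code)."""
    count = 0
    for d in reversed(distances):
        if d >= cutoff:
            break
        count += 1
    return count

# pair distances sampled at consecutive time steps
trajectory = [1.8, 1.2, 0.9, 0.8, 0.7]
print(contact_time_steps(trajectory, cutoff=1.0))  # -> 3
```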

Changed requirements

  • The project now requires C++20 and CUDA 12 (#3918, #4612, #4931).
  • The build system now supports the Intel oneAPI C++ Compiler (#4532), the Cray Clang compiler (#5201), and the NVIDIA HPC SDK (#5257).
  • The waLBerla library is now a dependency for all LB and EK methods (#2701, #4726). If not found, it is built from sources automatically.
  • The heFFTe library is now a dependency for Coulomb P3M (#5063) and the EK FFT solver (#5101). If not found, it is built from sources automatically.
  • The Kokkos and Cabana libraries are now dependencies for shared-memory parallelism (#5074). If not found, they are built from sources automatically.
  • The OpenMP component of the FFTW3 library is now a dependency for shared-memory parallelism (#5086). This component is sometimes packaged separately from the MPI FFTW3 library on HPC clusters.
  • The HighFive library is now a dependency for hdf5 file I/O (#5087). It is built from sources automatically. The h5xx library is no longer a dependency.
  • The GNU GSL library is now a dependency for MMM1D (#5201).
  • The minimal versions of all dependencies were increased (#4532, #4612, #4717, #4931, #5093, #5201, #5223): CMake >= 3.27.6, Python >= 3.11, Cython >= 3.0.4, Boost >= 1.83, CUDA >= 12.0, OpenMPI >= 4.0, MPICH >= 3.4.1, GCC >= 12.2, Clang >= 18.1, AppleClang >= 17.0, CrayClang >= 17.0, Intel oneAPI C++ Compiler >= 2023.1; Python package versions are pinned to those available in the Ubuntu 22.04 repository. CUDA 12.6 and later versions are now supported (#5129).

Feature configuration at compile time

  • All project-specific CMake options have been renamed (#4612). This change was required to avoid name collisions with external projects. Please refer to the user guide chapter on installing ESPResSo to find out the new option names. Using the old option names will generate warnings, but CMake will carry on and use default values instead of the values you provided. Please don’t ignore these warnings when adapting your build scripts.
  • The CMake option ESPRESSO_CUDA_COMPILER was removed in favor of the environment variable CUDACXX (#4642).
  • A config file is now available to build the project automatically in Codespaces (#5201, #4531).
  • An AGENT.md is now available to guide agentic coding tools (#5220).

Improved documentation

  • A Widom insertion tutorial was added (#4546).
  • A lattice-Boltzmann sedimentation tutorial was added (#4570).
  • A machine-learned potentials tutorial was added (#4982).
  • An atomistic water simulation tutorial was added (#5174).
  • An electrodes tutorial with ICC/ELC/ELC-IC was added (#4784).
  • A Boltzmann inversion tutorial was added (#5187).
  • A Grand Canonical Monte Carlo tutorial was added (#4670).
  • The electrokinetics tutorial was completely rewritten and now features chemical reactions (#4782).
  • All tutorials were re-designed for JupyterLab (#4830). Reliance on Jupyter extensions and plugins has been significantly reduced in an effort to improve compatibility with other Jupyter backends. In particular, VS Code Jupyter is still actively supported. Jupyter Notebook (Classic Notebook) should still be compatible, although it is not actively tested. IPython is no longer supported.
  • Most tutorials adopted ZnDraw as the visualization backend (#4976, #4975).
  • A high-throughput computing sample based on the Dask scheduler was added (#4781).
  • All supported debuggers and profilers are now documented: Caliper, Valgrind, GDB, CUDA-GDB, kernprof, perf, UBSAN, ASAN (#4747).
  • Installation instructions were improved with better sectioning (#5062).
  • The CUDA 12 circular dependency in Ubuntu 24.04 packages is documented (#4642).

Interface changes

  • The original LB classes LBFluid and LBFluidGPU were removed in favor of a unified LBFluid class for both CPU and GPU (#2701, #4726, #5230). Their arguments have also changed, e.g. dens became density and visc became viscosity. The pressure_tensor_neq property was removed.
  • The original EK class Electrokinetics was removed in favor of a unified EKSpecies class for both CPU and GPU (#2701, #4726, #5101, #5230).
  • Self-propelled particles (swimmers) have been completely re-implemented (#4745). The propulsion mechanism can now only be set up with a force. When coupling to a LB fluid, a real particle and a virtual site are used to create the dipole.
  • The long-range actors API was completely redesigned (#4749).
  • CPU and GPU algorithms now have a unified Python class (#5230). Pass optional argument gpu=True to the constructor to select the GPU backend. For example, class espressomd.electrostatics.P3MGPU was removed in favor of espressomd.electrostatics.P3M, which now manages both the CPU and GPU backends. Likewise, DipolarDirectSum replaces both DipolarDirectSumCpu and DipolarDirectSumGpu.
  • The virtual sites API was completely redesigned (#4820, #4603).
  • The collision detection API was completely redesigned (#4987).
  • The Galilei transform API was completely redesigned (#4816).
  • Class attributes expecting 3 boolean values no longer accept integer values (#4541). It is no longer possible to set properties system.periodicity, particle.fix and particle.rotation with e.g. [1, 1, 1] or [0, 0, 1].
  • reaction_methods.ReactionAlgorithm.reaction() now takes steps instead of reaction_steps as argument, for consistency with the MD integrator (#4666).
  • io.mpiio.Mpiio() now takes a system as argument (#4950).
  • analysis.pressure() and analysis.pressure_tensor() now take the DPD stress tensor into account in the total pressure, and got an additional member dpd (#5045).
  • analysis.energy(), analysis.pressure() and analysis.pressure_tensor() got additional members kinetic_lin and kinetic_rot to separate linear and angular kinetic energy/pressure (#5043).
  • cluster_analysis.ClusterStructure() now takes a system as argument (#4950).
  • interactions.ThermalizedBond() parameter seed was moved to system.thermostat.set_thermalized_bond() (#4845). This change better reflects the fact that there is only one global seed for all thermalized bonds; until now, this global seed was overwritten by any newly created thermalized bond, whether it was added to the system or not.
  • All P3M algorithms now accept an extra argument tune_limits to constrain the range of mesh values explored during mesh size tuning (#5017).
  • The check_complex_residuals optional argument of the P3M algorithm was removed (#5189).
  • Python objects of type pathlib.Path can now be passed to functions that expect file paths (#5128).
  • Bond breakage can now be triggered manually (#4995). This is meant to be used in Lees-Edwards simulations with a time-dependent shear, since bonds extending across the shear boundary can increase in length without a change in particle positions as the simulation time increases.
  • Analysis.particle_energy() was renamed to Analysis.particle_non_bonded_energy() to better reflect the calculated quantity, since kinetic, bonded, electrostatic and magnetostatic contributions are not part of this energy (#5226).
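
To illustrate the stricter typing of 3-component boolean properties (#4541), the new behavior can be mimicked with a small standalone validator. This is a hypothetical mock for illustration, not the actual ESPResSo validation code:

```python
def validate_bool3(values):
    """Mock of the stricter validation now applied to 3-component
    boolean properties such as system.periodicity, particle.fix and
    particle.rotation (hypothetical sketch, not ESPResSo code)."""
    values = list(values)
    if len(values) != 3:
        raise ValueError("expected exactly 3 values")
    if not all(isinstance(v, bool) for v in values):
        raise TypeError("expected booleans such as [True, True, False], "
                        "not integers such as [1, 1, 1]")
    return tuple(values)

validate_bool3([True, True, False])  # accepted
# validate_bool3([0, 0, 1])          # now raises TypeError
```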

Removed functionality

  • The lb.LBBoundaries() framework was removed (#4381). Shapes can now be passed directly to LB and EK objects.
  • The magnetostatics.DipolarDirectSumWithReplicaCpu() method was removed, since the magnetostatics.DipolarDirectSumCpu() method now supports replicas (#4559).
  • The electrostatics.MMM1DGPU() feature was removed (#4928).
  • The magnetostatics.DipolarBarnesHutGpu() feature was removed (#4928).
  • The MDAnalysis bindings were removed (#4535).
  • The bind_three_particles collision mode was removed (#4823).
  • LB populations are no longer accessible from the Python interface (#5075).

Improved testing

  • The Armv8 architecture is now tested in CI (#5020).

Performance enhancements

  • LB now supports multi-GPU acceleration (#5007).
  • Observables are now fully MPI-parallel and outperform equivalent hdf5 or MPI-IO write operations to an SSD (#4748).
  • Reaction methods are now fully MPI-parallel and only invalidate the system state after a batch of particle changes has been applied (#4666).
  • Performance of particle property getters and setters has improved, in particular vector quantities such as force and velocity are 25 times faster to read from and 3 to 4 times faster to write to (#5209, #5124, #5069).
  • The RegularDecomposition cell system no longer uses a ghost layer when the simulation has only 1 MPI rank (and any number of OpenMP threads), which improves the performance of a Lennard-Jones simulation by 11% for 1 thread (#5157).
  • Shared-memory parallelism (OpenMP) is now supported in short-range force calculation (#4754, #5097), Coulomb and Dipolar P3M (#5086, #5189), LB and EK (#5083).

Bug fixes

  • UTF-8 strings are now supported in all features (#5128).
  • Updating an active non-bonded interaction via e.g. system.non_bonded_inter[0, 0].lennard_jones.set_params() now uses the default arguments when optional arguments are missing and raises an error when required arguments are missing (#4558). In previous ESPResSo versions, missing optional and required arguments would be recycled from the previous state of the non-bonded interaction (#4569).
  • Thermalized LB simulations are now fully decorrelated (#4845, #4848). In previous ESPResSo versions, the LB thermostat seed argument was actually used as the RNG counter, thus ensemble runs would produce almost the same trajectory.
  • Particle coordinates are now properly folded in the histogram and RDF classes to avoid off-by-one errors (#5109).
  • The particle_data.ParticleHandle() and io.writer.h5md.H5md() classes now use properly folded particle coordinates (#4940, #4944). In previous ESPResSo versions, cached coordinates would be used, which could be out-of-date when large Verlet list skin values were used.
  • Lees-Edwards now applies the offset in the correct direction and no longer requires the user to provide a shear_velocity multiplied by -1 (#5081).
  • Lees-Edwards boundary conditions now support the regular decomposition cell system via the new fully_connected_boundary argument (#4958).
  • The isotropic NpT algorithm was completely rewritten and now supports two barostats: Andersen (#5053) and MTK (#5077).
  • It is no longer possible to change the reaction constant of an existing reaction to a gamma value less than or equal to 0 (#4666).
  • When setting up a reaction method with two or more reactions, a runtime error is raised if a reaction accidentally overwrites the default charge of a specific type with a different value (#4666).
  • It is no longer possible to add an angle bond or dihedral bond with a list of partner particle ids containing duplicate entries, since the angle would be undefined (#5012).
  • Adding the same object twice to an ObjectList now raises a runtime error; removing an unknown object from an ObjectList now raises a runtime error (#4779). In previous ESPResSo versions, adding the same object twice to a list could have unintended side effects.
  • The SimplePore distance function was corrected and no longer generates NaN values (#5016).
  • The FENE bond now breaks when compressed beyond its stretching limit (#5195).
  • Coulomb and Dipolar P3M algorithms no longer emit warnings nor trigger assertions when particles are close to the box boundaries, when running simulations on a CPU that supports extended precision floating-point numbers (#5136).
  • OpenMPI 5.0 no longer triggers random PRRTE errors at system exit (#5093).
  • Default-constructed Utils::Vector and Utils::Array objects are now properly zero-initialized (#5257). In previous releases, the default constructor could accidentally leave the underlying data uninitialized when using the NVHPC compiler toolchain and building at the -O2 or -O3 optimization level (#5263).
  • Thole corrections are no longer part of the energy calculated by Analysis.particle_non_bonded_energy() (#5226).
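
Periodic folding, which several of the fixes above now apply consistently, maps each coordinate back into the primary simulation box. A minimal standalone sketch of the operation (not ESPResSo's internal routine):

```python
def fold_position(pos, box_l):
    """Fold a position into the primary box [0, box_l) along each axis
    (conceptual sketch of periodic folding, not ESPResSo code)."""
    return [x % l for x, l in zip(pos, box_l)]

print(fold_position([12.5, -0.5, 3.0], [10.0, 10.0, 10.0]))  # -> [2.5, 9.5, 3.0]
```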

Under the hood changes

  • Most Cython files have been converted to Python files (#4541, #4713).
  • Cython 3 is now supported (#4845).
  • Sources of NaN, float overflow, and most sources of float underflow were addressed (#5020).
  • GPU algorithms no longer leak device memory (#4741, #4764).
  • CPU implementations of the P3M algorithm no longer leak memory (#4947).
  • The CPU implementation of the P3M Coulomb algorithm was entirely rewritten using the heFFTe library (#5063). The method is now easier to modify and extend, supports shared-memory parallelism (#5086, #5189), and uses real-to-complex transforms (#5204).
  • Project-specific compiler diagnostics are no longer propagated to external projects like waLBerla (#4642).
  • The build system now relies on CMake’s native CUDA support (#4642).
  • The build system now installs the object-in-fluid Python module when espressomd is installed (#4931).
  • The script interface was massively simplified (#4816).
  • Most global variables were removed (#4741, #4783, #4816, #4845, #4950).
  • The ESPResSo repository can now be cloned without git flag --recursive (#5031). ESPResSo developers are now expected to integrate new third-party libraries using the CMake FetchContent mechanism instead of git submodules.
  • The build system now properly handles linking of ESPResSo against static and shared libraries, sets the correct runpaths, and avoids cyclic dependencies during the linking stage (#5221, #5173). These changes are most relevant to cluster admins and package maintainers.

pyMBE 1.0.0 released

We are pleased to announce the release of pyMBE v1.0.0 (doi:10.5281/zenodo.12102634), a major update of our open-source Python package for building and managing coarse-grained models of polyelectrolytes, peptides, proteins, and hydrogels in ESPResSo.

This release significantly expands pyMBE’s capabilities, introducing new tools for constructing complex molecular architectures such as hydrogels, improving internal consistency, and enhancing interoperability with modern Python and ESPResSo versions. Parameter sets from previous work are now directly included within the package, and the internal bookkeeping of molecular topologies has been streamlined.

Among the highlights of v1.0.0 are:

  • New methods for hydrogel generation and analysis, including dedicated sample scripts and benchmarks.
  • Improved data structures linking particles, residues, and molecules, with dedicated object deletion methods.
  • Enhanced logging, exception handling, and CI testing, ensuring more robust and transparent workflows.
  • Full support for NumPy 2, Pandas 2, and both ESPResSo 4.2 and the development version of ESPResSo, as well as compatibility with Conda environments.
  • Updated tutorials and examples demonstrating molecule setup, post-processing, and visualization.

pyMBE continues to be developed and maintained by an active community of soft matter researchers interested in the molecular modeling of weak polyelectrolytes and biomacromolecules. We warmly welcome new users and contributors to join our efforts and help shape future releases!

Learn more about pyMBE in our publication at The Journal of Chemical Physics (doi:10.1063/5.0216389) and explore the full documentation and examples in our GitHub repository.

Invitation to the ESPResSo Summer School 2025

Systematic coarse-graining and machine learning in soft matter physics with ESPResSo

Date:
October 6, 2025 – October 10, 2025

Location:
ICP, University of Stuttgart, Germany (campus 3D map)

Register:
https://www.cecam.org/workshop-detail/1406

Schedule: PDF

Course description

Scientific content

This school will teach coarse-graining[10], reverse coarse-graining, chemical space exploration[11], machine learning descriptors, machine-learned effective potentials, reinforcement learning, soft matter physics, and lattice-Boltzmann hydrodynamics.

Lectures will provide an introduction to the physics and model building of these systems as well as an overview of the necessary simulation algorithms. During the afternoon, participants will practice running their own simulations in tutored hands-on sessions using the software ESPResSo[1]. Many of the lectures and hands-on sessions will be taught by developers of the software. Hence, the school will also provide a platform for discussion between developers and users about the future of the software used in the hands-on sessions. Moreover, users can get advice on their specific simulation projects. Time will also be dedicated to research talks, which illustrate how the simulation models and software are applied, and which provide further background on simulating soft matter at different length and time scales.
Poster session

You have the opportunity to bring a poster to introduce your work to your peers. We welcome abstract submissions on both planned and ongoing research projects, done with or without ESPResSo, as long as they fit the general themes of this event. The abstract should contain at most 400 words, excluding the bibliography, and must not have been published elsewhere.

Everyone bringing a poster is invited to present it in a 1-minute lightning talk during the poster session. The poster boards will remain up for the entire duration of the school. Accepted contributions will be published in a book of abstracts under a permissive open-source license on Zenodo.

Invited speakers

  • Philip Loche, EPFL (Switzerland)
  • Denis Andrienko, Max Planck Institute for Polymer Research (Germany)
  • Christoph Junghans, Los Alamos National Laboratory (United States)
  • Simon Olsson, Chalmers University of Technology (Sweden)
  • Julija Zavadlav, Technical University of Munich (Germany)
  • Nico van der Vegt, Technical University of Darmstadt (Germany)
  • Markus Miettinen, University of Bergen (Norway)

Teaching material

Hands-on sessions

We use interactive Jupyter notebooks to teach concrete applications of the simulation methods introduced in the lectures. These notebooks outline physical systems relevant to soft matter physics and sketch simulation scripts written for ESPResSo using the Python language. A few parts of these scripts are hidden and need to be completed by participants, with the help of the ESPResSo user guide and the tutors.

These exercises can also be carried out in self-study after the school via the online platforms Binder and Gitpod, and all exercises have hidden solutions that can be revealed at any time.

Software

In this school, participants learn to conduct and link simulations at different scales by means of systematic coarse-graining and machine learning. The focus will be on coarse-grained models from the broad fields of statistical physics, soft matter and active matter, using the software ESPResSo (espressomd.org). ESPResSo is an open-source particle-based simulation package with a focus on coarse-grained molecular dynamics models. In addition, it offers a wide range of schemes for solving electrostatics, magnetostatics, hydrodynamics and electrokinetics, as well as algorithms for active matter and chemical reactions[1,4]. These methods can be combined to simulate different scales and recover emergent material properties at macroscopic scales. In addition, we can couple ESPResSo to external software to offload calculation of forces using machine-learned potentials, or carry out reinforcement learning to control smart agents in active matter simulations.

ESPResSo consists of an MPI-parallelized simulation core written in C++ and a scripting interface in Python which integrates well with scientific Python packages, such as NumPy, pyMBE[5], pyOIF[6], VOTCA[7], ZnDraw[8] and SwarmRL[9]. ESPResSo relies on waLBerla, a high performance lattice-Boltzmann library, for hydrodynamics and other lattice-based schemes for electrokinetics and related fields[2]. Custom waLBerla kernels can be rapidly prototyped in symbolic form in Python and automatically converted to highly optimized, performance-portable code for CPUs and GPUs[3].

Event organization

This school is planned as an on-site event. We don’t charge a participation fee.

Hands-on sessions will be tutored by experienced ESPResSo users and developers. There will be additional opportunities for scientific exchange during the event: scientific speed dating and BBQ on Monday, poster session on Tuesday, city tour and conference dinner on Thursday.

A preliminary schedule is available as a PDF file in the Documents tab. The school starts on Monday at 9am and ends on Friday at 1pm.

References

[1] F. Weik, R. Weeber, K. Szuttor, K. Breitsprecher, J. de Graaf, M. Kuron, J. Landsgesell, H. Menke, D. Sean, C. Holm, Eur. Phys. J. Spec. Top., 227, 1789-1816 (2019)
[2] M. Bauer, S. Eibl, C. Godenschwager, N. Kohl, M. Kuron, C. Rettinger, F. Schornbaum, C. Schwarzmeier, D. Thönnes, H. Köstler, U. Rüde, Computers & Mathematics with Applications, 81, 478-501 (2021)
[3] M. Bauer, J. Hötzer, D. Ernst, J. Hammer, M. Seiz, H. Hierl, J. Hönig, H. Köstler, G. Wellein, B. Nestler, U. Rüde, Code generation for massively parallel phase-field simulations, 2019
[4] R. Weeber, J. Grad, D. Beyer, P. Blanco, P. Kreissl, A. Reinauer, I. Tischler, P. Košovan, C. Holm, ESPResSo, a Versatile Open-Source Software Package for Simulating Soft Matter Systems, 2024
[5] D. Beyer, P. Torres, S. Pineda, C. Narambuena, J. Grad, P. Košovan, P. Blanco, The Journal of Chemical Physics, 161, (2024)
[6] I. Jančigová, K. Kovalčíková, R. Weeber, I. Cimrák, PLoS. Comput. Biol., 16, e1008249 (2020)
[7] S. Mashayak, M. Jochum, K. Koschke, N. Aluru, V. Rühle, C. Junghans, PLoS. ONE., 10, e0131754 (2015)
[8] R. Elijošius, F. Zills, I. Batatia, S. Norwood, D. Kovács, C. Holm, G. Csányi, Nat. Commun., 16, 5991 (2025)
[9] S. Tovey, C. Lohrmann, T. Merkt, D. Zimmer, K. Nikolaou, S. Koppenhöfer, A. Bushmakina, J. Scheunemann, C. Holm, Eur. Phys. J. E, 48, 16 (2025)
[10] C. Scherer, R. Scheid, D. Andrienko, T. Bereau, J. Chem. Theory Comput., 16, 3194-3204 (2020)
[11] R. Menichetti, K. Kanekal, T. Bereau, ACS Cent. Sci., 5, 290-298 (2019)

pyMBE 0.8.0 released

We are happy to announce the first release of pyMBE (doi:10.5281/zenodo.12102635), an open-source Python package designed to facilitate the design of custom coarse-grained models of polyelectrolytes, peptides and proteins in ESPResSo. pyMBE extends the ESPResSo API with methods to automate repetitive and error-prone tasks, such as setting up chemical bonds, non-bonded interactions and reaction methods.

pyMBE is maintained by an active community of soft matter researchers with a shared interest in the modeling of weak polyelectrolytes and biomacromolecules. We welcome new users and developers to join the project and contribute new features!

Learn more about pyMBE in our recent publication at The Journal of Chemical Physics (doi:10.1063/5.0216389), where we outline the main features of pyMBE and show how it can be leveraged in computational soft matter research.

ESPResSo 4.2.2 released

This release provides a number of corrections for the ESPResSo 4.2 line. We recommend that this release be used for all production simulations. The interface has not been changed between ESPResSo 4.2.1 and 4.2.2. However, some bugs were discovered which can affect simulation results.

Please find the list of changes below. The numbers in brackets refer to ticket numbers on https://github.com/espressomd/espresso

Get the source code in the Download area.

Improved documentation

  • Installation instructions now mention the FFTW3 MPI dependency of long-range solvers and provide recommended version numbers for Jupyter Notebook dependencies (#4790).
  • Installation instructions now mention Python environments (#4922).
  • Observables now properly document return values and array shapes, and use a more consistent mathematical notation (#4898).

Bug fixes

  • Fatal runtime errors due to the lifetime of MPI global variables were addressed (#4858). Older ESPResSo releases built with Boost 1.84 or later might randomly crash when exiting the Python interpreter.
  • Virtual sites no longer contribute to the kinetic energy of the system (#4839). The regression was introduced in April 2021 and affected the 4.2 branch of ESPResSo.
  • Inertialess tracers are now integrated along the z-axis (#4714). The regression was introduced in February 2022 and affected the 4.2 branch of ESPResSo.
  • Inertialess tracers now throw an exception when attempting to use LB GPU with 2 or more MPI ranks (#4714). Before, tracers on non-root MPI ranks would be silently ignored by the CUDA kernels, and would have a constant velocity, either 0 if the particle never visited the fluid domain on the root rank, or the last known velocity if the particle was once on the root rank. This bug affected all ESPResSo versions.
  • Particles close to the faces of the simulation box are now properly coupled to the LB fluid (#4827). Due to numerical instability, it was previously possible for particles to be outside the simulation box by a tiny amount and skip LB particle coupling. The probability of this bug occurring was low, but could be enhanced in simulations that purposefully placed particles near the faces of the simulation box: polymers sheared by Lees-Edwards boundary conditions, raspberry particles (colloids, bacteria, etc.) when crossing a periodic boundary, or cell membranes placed close to a periodic boundary.
  • Resizing the box now throws a runtime error if there are constraints present (#4778), since constraint preconditions might no longer be fulfilled. For example, a wall constraint might end up outside the box boundaries when the box shrinks.
  • Resizing the box via system.box_l = new_box_l now throws a runtime error if there are particles present, because particle position folding cannot be guaranteed to be correct (#4901); use system.change_volume_and_rescale_particles() instead, which properly rescales particle positions.
  • The velocity Verlet NpT propagator doesn’t apply friction and noise to angular velocities. ESPResSo now throws an error when NpT encounters a rotating particle (#4843). This bug affected all ESPResSo versions.
  • The Brownian thermostat can no longer be configured with act_on_virtual=True due to an unresolved bug (#4295) that will be addressed in the next minor release.
  • Restrictions on the number of MPI ranks have been lifted from the checkpointing mechanism (#4724). It is now possible to use checkpointing again in MPI-parallel simulations when the system contains LB boundaries or Union shape-based constraints. These restrictions had been introduced in 4.2.0 for technical reasons that have since been resolved.
  • When passing an invalid value to a function that expects an input parameter of type list of size 3, an exception is now raised (#4911). Previously, some functions would print an error message and continue their execution with uninitialized data.
  • The per-type and per-mol_id contributions from system.analysis.energy(), system.analysis.pressure() and system.analysis.pressure_tensor() now return the correct values (#4788). Older versions of ESPResSo confused the particle mol_id with the particle type. The total pressure was unreliable when mol_id properties were set to non-zero values.
  • The OpenGL visualizer now extracts the correct non-bonded potential parameter sigma when feature WCA is compiled in but LENNARD_JONES isn’t (#4720). The regression was introduced in 4.2.1.
  • Method OifCell.elastic_forces() no longer throws a TypeError (#4813).
  • Benchmark scripts were adjusted to support large particle numbers (#4753).
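The box-resizing behaviour described above can be illustrated with a plain-Python sketch. This is not the ESPResSo API; rescale_positions is a hypothetical helper mimicking the isotropic rescaling that system.change_volume_and_rescale_particles() is described to perform:

```python
# Illustrative sketch only, not ESPResSo code: when the box length changes,
# particle coordinates must be scaled with the box rather than merely folded,
# otherwise relative particle positions are not preserved.
def rescale_positions(positions, old_l, new_l):
    """Return positions rescaled by the ratio of new to old box length."""
    factor = new_l / old_l
    return [[x * factor for x in pos] for pos in positions]

# Shrinking the box from L=10 to L=5 halves every coordinate.
pos = [[1.0, 2.0, 3.0], [9.0, 9.0, 9.0]]
print(rescale_positions(pos, 10.0, 5.0))
```

Plain folding of the second particle (9, 9, 9) into a box of length 5 would instead wrap it to (4, 4, 4), silently changing the physics, which is why assigning system.box_l directly now raises an error when particles are present.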

Under the hood changes

  • Several Clang 16 and GCC 13 compiler diagnostics have been addressed (#4715).
  • A non-critical GCC C++20 deprecation warning in Cython-generated code was disabled (#4725).
  • Several deprecation warnings emitted by CMake 3.27 have been silenced (#4792).
  • Added support for setuptools version 67.3.0 and above (#4709).
  • Added support for Python 3.12 in testsuites run by CTest (#4852).
  • Python requirements have been updated (#4924).
  • CI pipeline URLs have been fixed (#4736).

Invitation to the ESPResSo Summer School 2024

Simulating soft matter across scales

Date:
October 7, 2024 – October 11, 2024

Location:
ICP, University of Stuttgart (Germany)

Register:
https://www.cecam.org/workshop-detail/1324

Schedule: PDF

Course description

Scientific content

This school will teach coarse-grained and lattice-based simulation methods suitable for modeling soft matter systems at mesoscopic length and time scales. We will explore topics such as simulating coarse-grained ionic liquids in electrolytic capacitors to measure differential capacitance, simulating coarse-grained liquids with machine-learned effective potentials to match the properties of models with atomistic resolution, polymer diffusion in an implicit solvent, particle coupling to continuum hydrodynamic fields, and diffusion-advection-reaction solvers for electrokinetics and catalysis.

Lectures will provide an introduction to the physics and model building of these systems as well as an overview of the necessary simulation algorithms. During the afternoon, participants will practice running their own simulations in tutored hands-on sessions using the software ESPResSo[1] and waLBerla[3]. Many of the lectures and hands-on sessions will be taught by developers of the software. Hence, the school will also provide a platform for discussion between developers and users about the future of the software used in the hands-on sessions. Moreover, users can get advice on their specific simulation projects. Time will also be dedicated to research talks, which illustrate how the simulation models and software are applied, and which provide further background on soft matter at different length and time scales.

Poster session

As an on-site participant, you have the opportunity to bring a poster to introduce your work to your peers. We welcome abstract submissions on both planned and ongoing research projects, done with or without ESPResSo/waLBerla, as long as they fit the general themes of this event. The abstract should contain at most 400 words, not counting the bibliography, and must not have been published elsewhere.

Everyone bringing a poster is invited to present it in a 1 minute lightning talk during the poster session. The poster boards will remain up for the entire duration of the school. Accepted contributions will be published in a book of abstracts under a permissive open-source license on Zenodo.

Invited speakers

  • Timm Krüger, University of Edinburgh (United Kingdom)
  • Tristan Bereau, University Heidelberg (Germany)
  • Christine Peter, University of Konstanz (Germany)
  • Frederik Hennig, University of Erlangen–Nuremberg (Germany)
  • Matej Praprotnik, National Institute of Chemistry (Slovenia)
  • Pablo M. Blanco, Norwegian University of Science and Technology (Norway)

Teaching material

Hands-on sessions

We use interactive Jupyter notebooks to teach concrete applications of the simulation methods introduced in the lectures. These notebooks outline physical systems relevant to soft matter physics and sketch simulation scripts written for the ESPResSo and waLBerla packages using the Python language. A few parts of these scripts are hidden and need to be completed by participants, with the help of the ESPResSo and waLBerla user guides and the tutors.

These exercises can also be carried out in self-study after the school via the online platforms Binder and Gitpod, and all exercises have hidden solutions that can be revealed at any time.

Software

In this school, participants learn to conduct coarse-grained and lattice-based simulations suitable for modeling soft matter systems using the software ESPResSo and waLBerla. ESPResSo is an open-source particle-based simulation package with a focus on coarse-grained molecular dynamics models. In addition, it offers a wide range of schemes for solving electrostatics, magnetostatics, hydrodynamics and electrokinetics, as well as algorithms for active matter and chemical reactions[2]. These methods can be combined to simulate different scales and recover emergent material properties at macroscopic scales.

ESPResSo consists of an MPI-parallelized simulation core written in C++ and a scripting interface in Python which integrates well with scientific Python packages, such as numpy, pyMBE[5], pyOIF[6], ZnDraw[7] and SwarmRL[8]. ESPResSo relies on waLBerla, a high-performance lattice-Boltzmann library, for hydrodynamics and other lattice-based schemes for electrokinetics and related fields[3]. Custom waLBerla kernels can be rapidly prototyped in symbolic form in Python and automatically converted to highly optimized, performance-portable code for CPUs and GPUs[4].
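To give a flavor of this workflow, here is a toy stand-in in plain Python (not the actual pystencils/waLBerla API): the update rule of a lattice scheme is written as a simple stencil expression, which the code generator would then turn into optimized CPU/GPU kernels.

```python
# Toy example of the kind of stencil expressed symbolically for code
# generation; this plain-Python version is for illustration only.
def diffusion_step(field, alpha=0.1):
    """One explicit step of u_t = alpha * u_xx on a periodic 1D grid."""
    n = len(field)
    return [
        field[i] + alpha * (field[(i - 1) % n] - 2.0 * field[i] + field[(i + 1) % n])
        for i in range(n)
    ]

u = [0.0] * 8
u[3] = 1.0          # a single concentration peak
u = diffusion_step(u)
# the stencil conserves the total "mass" of the field
print(abs(sum(u) - 1.0) < 1e-12)
```

In the symbolic workflow, such an update rule is stated once as an equation and the generated kernel handles vectorization, GPU offloading and parallel ghost-layer exchange.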

Event organization

This school is planned as an on-site event. We don’t charge a participation fee.

Hands-on sessions will be tutored by experienced ESPResSo/waLBerla users and developers. There will be additional opportunities for scientific exchange during the event: scientific speed dating and BBQ on Monday, poster session on Tuesday, city tour and conference dinner on Thursday.

A preliminary schedule is available as a PDF file in the Documents tab. A more detailed version will be made available in the summer, once all speakers have confirmed their time slots. The school starts on Monday at 9am and ends on Friday at 1pm.

References

[1] R. Weeber, J. Grad, D. Beyer, P. Blanco, P. Kreissl, A. Reinauer, I. Tischler, P. Košovan, C. Holm, ESPResSo, a Versatile Open-Source Software Package for Simulating Soft Matter Systems, 2023
[2] F. Weik, R. Weeber, K. Szuttor, K. Breitsprecher, J. de Graaf, M. Kuron, J. Landsgesell, H. Menke, D. Sean, C. Holm, Eur. Phys. J. Spec. Top., 227, 1789-1816 (2019)
[3] M. Bauer, S. Eibl, C. Godenschwager, N. Kohl, M. Kuron, C. Rettinger, F. Schornbaum, C. Schwarzmeier, D. Thönnes, H. Köstler, U. Rüde, Computers & Mathematics with Applications, 81, 478-501 (2021)
[4] M. Bauer, J. Hötzer, D. Ernst, J. Hammer, M. Seiz, H. Hierl, J. Hönig, H. Köstler, G. Wellein, B. Nestler, U. Rüde, Code generation for massively parallel phase-field simulations, 2019
[5] D. Beyer, P. B. Torres, S. P. Pineda, C. F. Narambuena, J.-N. Grad, P. Košovan and P. M. Blanco, 2024, arXiv:2401.14954 [cond-mat.soft]
[6] I. Jančigová, K. Kovalčíková, R. Weeber, I. Cimrák, PLoS. Comput. Biol., 16, e1008249 (2020)
[7] R. Elijošius, F. Zills, I. Batatia, S. W. Norwood, D. P. Kovács, C. Holm and G. Csányi, 2024, arXiv:2402.08708 [physics.chem-ph]
[8] S. Tovey, C. Lohrmann, T. Merkt, D. Zimmer, K. Nikolaou, S. Koppenhöfer, A. Bushmakina, J. Scheunemann, C. Holm, 2024, arXiv:2404.16388 [cs.RO]

Invitation to the ESPResSo Summer School 2023

Simulating energy materials with ESPResSo and waLBerla

Date:
October 9, 2023 – October 13, 2023

Location:
hybrid format: onsite course at the ICP, University of Stuttgart (Germany) with live streaming on Zoom for online participants

Register:
https://www.cecam.org/workshop-detail/1229

Schedule: PDF, iCalendar

Course description

Scientific content

This school will teach the physics and simulation methods used to study energy materials. We will explore topics such as electrostatics in confinement, chemical reactions and catalysis, electrophoretic mobility, diffusion, and electrokinetics.

We will first introduce particle-based approaches and Monte Carlo schemes to model reactions in chemical systems. Then, we will cover the lattice-Boltzmann method for hydrodynamic interactions and a diffusion-advection-reaction solver for modelling electrokinetics and catalysis.

Lectures will provide an introduction to the physics and model building of these systems as well as an overview of the necessary simulation algorithms. During the afternoon, students will practice running their own simulations in hands-on sessions.

Many of the lectures and hands-on sessions will be taught by developers of the software. Hence, the school will also provide a platform for discussion between developers and users about the future of the software. Moreover, users can get advice on their specific simulation projects. Time will also be dedicated to research talks, which illustrate how the simulation software is applied, and which provide further background in the physics of energy materials.

Poster session

As an on-site participant, you have the opportunity to bring a poster to introduce your work to your peers. We welcome submissions on both planned and ongoing research projects, done with or without ESPResSo, as long as they fit the general themes of energy materials, fluid dynamics or soft matter physics.

Everyone bringing a poster is invited to present it in a 1 minute lightning talk during the poster session. The poster boards will remain up for the entire duration of the school.

Invited speakers

  • Stephan Gekle (Universität Bayreuth, Germany)
  • Timo Jacob (Ulm University, Germany)
  • Laura Scalfi (Freie Universität Berlin, Germany)
  • Peter Košovan (Charles University, Prague, Czech Republic)
  • Céline Merlet (Toulouse III – Paul Sabatier University, France)
  • Mathieu Salanne (Sorbonne University, Paris, France)
  • Svyatoslav Kondrat (Institute of Physical Chemistry, Warsaw, Poland)

Teaching material

Hands-on sessions

We use interactive Jupyter notebooks to teach concrete applications of the simulation methods introduced in the lectures. These notebooks outline physical systems relevant to soft matter physics and sketch simulation scripts written for the ESPResSo package using the Python language. A few parts of these scripts are hidden and need to be completed by participants, with the help of the ESPResSo user guide and tutors.

We offer tutoring to all on-site participants and to a small number of online participants via Zoom. These exercises can also be carried out in self-study using the web browser via Binder or Gitpod, and all exercises have hidden solutions that can be revealed at any time.

Software

In this school, students learn to conduct coarse-grained and lattice-based simulations suitable for modeling energy materials, skills that transfer easily to other fields of statistical physics and soft matter physics, using the software ESPResSo (espressomd.org) and waLBerla (walberla.net). ESPResSo is an open-source particle-based simulation package with a focus on coarse-grained molecular dynamics models. In addition, it offers a wide range of schemes for solving electrostatics, magnetostatics, hydrodynamics and electrokinetics, as well as algorithms for active matter and chemical reactions[1].

ESPResSo consists of an MPI-parallelized simulation core written in C++ and a scripting interface in Python which integrates well with science and visualization Python packages, such as numpy and PyOpenGL. ESPResSo relies on waLBerla, a high-performance lattice-Boltzmann library, for hydrodynamics and other lattice-based schemes for electrokinetics and related fields[2].

Event organization

This school is primarily planned as an on-site event. Lectures and talks will be streamed live on Zoom for online participants. Hands-on sessions will be tutored by experienced ESPResSo users and developers. There will be additional opportunities for scientific exchange during the user & developer meeting, poster session, Q&A sessions and social events (scientific speed dating, BBQ, city tour, speakers’ dinner).

Attendance to the summer school is free.

References

[1] F. Weik, R. Weeber, K. Szuttor, K. Breitsprecher, J. de Graaf, M. Kuron, J. Landsgesell, H. Menke, D. Sean, C. Holm, Eur. Phys. J. Spec. Top., 227, 1789-1816 (2019) DOI:10.1140/epjst/e2019-800186-9
[2] M. Bauer, S. Eibl, C. Godenschwager, N. Kohl, M. Kuron, C. Rettinger, F. Schornbaum, C. Schwarzmeier, D. Thönnes, H. Köstler, U. Rüde, Computers & Mathematics with Applications, 81, 478-501 (2021) DOI:10.1016/j.camwa.2020.01.007
[3] M. Bauer, J. Hötzer, D. Ernst, J. Hammer, M. Seiz, H. Hierl, J. Hönig, H. Köstler, G. Wellein, B. Nestler, U. Rüde, Code generation for massively parallel phase-field simulations. Published in: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis (2019) DOI:10.1145/3295500.3356186

ESPResSo 4.2.1 released

This release provides a number of corrections for the ESPResSo 4.2 line. We recommend that this release be used for all production simulations. The interface has not been changed between ESPResSo 4.2.0 and 4.2.1. However, some bugs were discovered which can affect simulation results.

No further bug fix releases will be provided for the 4.2 line.

Please find the list of changes below. The numbers in brackets refer to ticket numbers on https://github.com/espressomd/espresso

Get the source code in the Download area.

Added functionality

  • P3M and DipolarP3M can now be used with the hybrid decomposition cell system with 1 MPI rank (#4678).
  • Lattice-Boltzmann can now be used with the N-square and hybrid decomposition cell systems with 2 or more MPI ranks (#4676).

Changed requirements

  • The nbconvert version requirement was bumped to 6.5.1 to patch an XSS vulnerability (#4658).

Improved documentation

  • The user guide now documents how to improve the reproducibility of simulations that have checkpointing enabled (#4677).
  • The user guide now reflects that the lattice-Boltzmann profile observables can be used in parallel (#4583).
  • The active matter tutorial now uses an adequate engine dipole for the swimmer particle (#4585).
  • The error analysis tutorials have been improved (#4597).
  • The tutorials can now be used in VS Code Jupyter (both the desktop and web versions) and the mathematical formulas are now correctly displayed (#4531).
  • All ESPResSo-specific CMake options are now documented in the installation chapter of the user guide (#4608).
  • Python package installation instructions no longer feature package version numbers; instead, requirements.txt is used as a constraint file (#4638).
  • MMM1D algorithms now properly document their parameter names (#4677).
  • Reaction methods now cite the relevant literature (#4681).
  • Caveats for chain analysis methods are now documented (#4698).
  • Minor formatting issues in Sphinx and typos in Python docstrings were addressed (#4608).

Interface changes

  • A new boolean property System.virtual_sites.override_cutoff_check was introduced to allow disabling the cutoff range checks from virtual sites (#4623).

Removed functionality

  • The unused and untested Analysis.v_kappa() method was removed (#4534).

Improved testing

  • Improved unit testing of core functionality: P3M, MMM1D, OIF, virtual sites, and the script interface factory (#4631).

Bug fixes

  • The checkpointing mechanism now properly restores the particle quaternion and all derived quantities (#4637). Release 4.2.0 introduced a regression that caused checkpoint files to overwrite the particle quaternion/director with a unit vector pointing along the z direction whenever the DIPOLES feature was part of the myconfig file. This led to incorrect trajectories when reloading a simulation from a checkpoint file, if the particle director played a role in the simulation (e.g., relative virtual sites, Gay-Berne potential, anisotropic particles, active particles). In addition, the angular velocity in the body frame was restored with the wrong orientation. Since the default myconfig file contains DIPOLES, most ESPResSo users were affected.
  • The checkpointing mechanism now properly restores LB boundaries (#4649). Release 4.2.0 introduced a regression where reloading LB populations would accidentally reset LB boundary flags.
  • The checkpointing mechanism now restores P3M and DipolarP3M solvers without triggering a re-tune (#4677). In previous releases, the checkpointing code would automatically re-tune these algorithms during a reload, causing tiny deviations in the forces that were problematic for trajectory reproducibility.
  • Brownian dynamics now integrates the rotational dynamics of rotatable particles whose position is fixed in 3D space (#4548).
  • Langevin dynamics now properly integrates particles with anisotropic friction (#4683, #4690).
  • A regression that caused virtual sites to incorrectly count their image box when crossing a periodic boundary has been fixed (#4564, #4707).
  • Particles can no longer be created or updated with a negative mass or a null mass (#4679).
  • Particles created without a user-specified type can now participate in reactions (#4589).
  • When a Monte Carlo displacement move is rejected, the original particle velocity is now restored (#4589).
  • Reaction methods now raise an exception when accidentally calling method.reaction(steps=20) instead of method.reaction(reaction_steps=20) (#4666). Since 4.2.0, the steps argument was silently ignored, in which case the default value reaction_steps=1 would be used by the core. Note that in the next minor release of ESPResSo, the reaction_steps argument will be renamed to steps.
  • Reaction methods now rebuild the list of free particle ids every time WidomInsertion::calculate_particle_insertion_potential_energy() and ReactionAlgorithm::do_reaction() are called (#4609). This was needed to allow multiple concurrent reactions, as well as avoiding subtle bugs when both the user and a reaction method tried to create a new particle with an id that used to belong to a deleted particle.
  • When all particles are cleared, the reaction methods type map is now also cleared (#4645). In the past, it was possible to attempt a reaction on particles that had just been cleared from the system, which would raise an exception. This bug affected all ESPResSo releases since 4.0.
  • The System.part.pairs() method now returns the correct particle pairs when particle ids aren’t both contiguous and starting from 0 (#4628). The regression was introduced in release 4.2.0.
  • The auto-exclusions feature no longer adds spurious exclusions for particle ids in the range [1, distance]. This bug could break the physics of the system and raise an exception in systems with non-contiguous particle ids. The regression was introduced in release 2.2.0b.
  • The structure factor analysis code no longer double-counts particles when the same particle type is provided twice (#4534).
  • The minimal distance distribution analysis code no longer has an arbitrary cutoff distance when the simulation box is aperiodic (open boundaries); this would cause spurious artifacts to appear in the histogram at r = np.sum(system.box_l) when particles were further apart than this arbitrary distance (#4534).
  • The cluster analysis functions are now disabled for systems with Lees-Edwards periodic boundaries, since the cluster analysis position wrapping code doesn’t properly handle the shear offset (#4698).
  • The chain analysis methods now raise an error when the number of chains or beads per chain is invalid (#4708).
  • The observable tests no longer rely on deprecated numpy options that were removed in numpy 1.24 (#4635).
  • The visualizer *_arrows_type_materials options now have an effect on arrow materials (#4686).
  • The visualizer exception handling mechanism has been made less brittle (#4686).
  • The visualizer no longer raises an exception when the optional dependency freeglut isn’t installed (#4691).
  • The visualizer could randomly freeze when using collision detection or bond breakage; a temporary workaround has been introduced that fixes the issue for simulations that use only 1 MPI rank (#4686).
  • The __dir__() method of script interface objects no longer raises an exception (#4674).
  • Compilation and testsuite issues involving missing or incorrect feature guards were addressed (#4562, #4648).
  • The build system no longer silently ignores invalid external feature definitions in myconfig.hpp and CMake files (#4608). This issue would only affect feature developers, as well as users of very old compilers, and would lead to ESPResSo builds missing features.
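The periodic-image bookkeeping behind the virtual-site fix (#4564, #4707) can be sketched in plain Python. The helper below is hypothetical, not ESPResSo code: a coordinate is folded into the primary box, and the number of box lengths subtracted is recorded as the image box index.

```python
import math

# Hypothetical helper, not ESPResSo code: fold a coordinate into [0, box_l)
# and record how many box lengths were subtracted (the image box index).
# Miscounting this index was the essence of the virtual-site regression.
def fold_position(x, box_l):
    image = math.floor(x / box_l)
    return x - image * box_l, image

print(fold_position(12.5, 10.0))  # one image to the right: (2.5, 1)
print(fold_position(-0.5, 10.0))  # one image to the left: (9.5, -1)
```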

Under the hood changes

  • The Clang 14 and AppleClang 14 compilers are now supported (#4601).
  • Several Clang 14 compiler diagnostics have been addressed (#4606).
  • Boost 1.81 and later versions are now supported (#4655).
  • Compiler errors on non-x86 architectures were addressed (#4538).
  • Test tolerances were adjusted for non-x86 architectures (#4708).
  • The pypresso script now prints a warning when running with MCA binding policy “numa” on NUMA architectures that are not supported in singleton mode by Open MPI 4.x (#4607).
  • The config file generator has been rewritten to properly handle external features and compiler errors (#4608).
  • Security hardening for GitHub Workflows (#4577, #4638) and Codecov (#4600).
  • Deployment of the user guide to GitHub Pages now relies on cloud providers to fetch JavaScript dependencies (#4656).

Job Posting: Research Software Engineer in Molecular dynamics and lattice-Boltzmann

The Institute for Computational Physics at the University of Stuttgart is looking for a research software engineer to work on our open source simulation package ESPResSo.

Your tasks

  • Coupling of particle-based algorithms like molecular dynamics to lattice-based ones such as lattice-Boltzmann
  • Off-loading of parts of the computation to GPUs using CUDA
  • Performance engineering, in particular with respect to parallelism and Monte Carlo methods
  • Occasional contributions to other packages, e.g., the waLBerla/pystencils/lbmpy software used by ESPResSo for lattice-Boltzmann and diffusion-advection-reaction simulations
  • Contributing to the maintenance of the molecular dynamics software ESPResSo, its documentation, and the continuous integration tooling

Your qualifications

  • A strong interest in scientific software development and simulations
  • An M.Sc. or Ph.D. in physics, computer science, simulation technology or a related discipline
  • Proven experience in C++, experience in CUDA and Python are an asset
  • Proven experience in numerical work such as simulations
  • The willingness to engage with an interdisciplinary user and developer community
  • The ability to pursue complex projects both in teams and independently

What we offer

  • A 12- to 18-month full-time position (EG TV-L 13, 39.5 hours/week)
  • An exciting and friendly working environment
  • Interesting and challenging development projects
  • A well-established CI/CD process, including automated testing and code review
  • Visibility of your work, as ESPResSo is an open-source project
  • Frequent interactions with users of the software and the ability to foster your international network
  • Ample opportunities for skill development, including training by the Stuttgart High Performance Computing Center (HLRS)
  • Excellent compute resources

To apply

Please send your cover letter, CV, and contact details for two references to application@icp.uni-stuttgart.de by May 15, 2023. If you have contributed to publicly hosted projects, please include links to your GitHub page or similar.

Diversity

At the University of Stuttgart, we actively promote diversity among our employees. We have set ourselves the goal of recruiting more female scientists and employing more people with an international background, as well as people with disabilities. We are therefore particularly pleased to receive applications from such people. Regardless, we welcome any good application.

Women who apply will be given preferential consideration in areas in which they are underrepresented, provided they have the same aptitude, qualifications and professional performance. Severely disabled applicants with equal qualifications will be given priority.

As a certified family-friendly university, we support the compatibility of work and family, and of professional and private life in general, through various flexible modules. We have an employee health management system that has won several awards and offer our employees a wide range of continuing education programs. We are consistently improving our accessibility. Our Welcome Center helps international scientists get started in Stuttgart. We support partners of new professors and managers with a dual-career program.

Information in accordance with Article 13 DS-GVO on the processing of applicant data can be found in German at https://careers.uni-stuttgart.de/content/Datenschutz/?locale=de_DE

Invitation to the ESPResSo Summer School 2022

Simulating the dynamics of soft matter with ESPResSo, PyStencils and LbmPy

Date:
October 10, 2022 – October 14, 2022

Location:
hybrid format: onsite course at the ICP, University of Stuttgart (Germany) with live streaming on Zoom for online participants

Register:
https://www.cecam.org/workshop-detail/1146

Schedule: PDF, iCal

Notes from the Organizers

This school is currently planned as an onsite event. If the COVID rules in force in autumn do not allow it, we will adjust the format of the school to allow remote attendance.

Course description

In this school, students learn to conduct simulations in the fields of statistical physics, soft matter and active matter using the software ESPResSo. It is an open-source particle-based simulation package with a focus on coarse-grained molecular dynamics models. In addition, it offers a wide range of schemes for solving electrostatics, magnetostatics, hydrodynamics and electrokinetics, as well as algorithms for active matter and chemical reactions[1].

ESPResSo consists of an MPI-parallelized simulation core written in C++ and a scripting interface in Python which integrates well with science and visualization Python packages, such as numpy and PyOpenGL. ESPResSo relies on waLBerla, a high-performance lattice-Boltzmann library, for hydrodynamics and other lattice-based schemes for electrokinetics and related fields[2].

In this school, after an introduction to particle-based simulations and the software interface, we will focus on the dynamics of soft matter. We will explore topics such as electrophoretic mobility of colloids, diffusion of polymers, and studying rheology using Lees-Edwards boundary conditions. In addition to particle-based approaches, we will cover the lattice-Boltzmann method for hydrodynamic interactions and a diffusion-advection-reaction solver for modelling electrokinetics and catalysis. Lectures will provide an introduction to the physics and simulation model building as well as an overview of the necessary simulation algorithms. During the afternoon, students will practice running their own simulations in hands-on sessions.

Many of the lectures and hands-on sessions will be taught by developers of the software. Hence, the school will also provide a platform for discussion between developers and users about the future of the software. Moreover, users can get advice on their specific simulation projects. Time will also be dedicated to research talks, which illustrate how the simulation software is applied, and which provide further background in the field of soft matter dynamics.

Attendance to the summer school is free.

[1] F. Weik, R. Weeber, K. Szuttor, K. Breitsprecher, J. de Graaf, M. Kuron, J. Landsgesell, H. Menke, D. Sean, C. Holm, Eur. Phys. J. Spec. Top., 227, 1789-1816 (2019) DOI:10.1140/epjst/e2019-800186-9
[2] M. Bauer, S. Eibl, C. Godenschwager, N. Kohl, M. Kuron, C. Rettinger, F. Schornbaum, C. Schwarzmeier, D. Thönnes, H. Köstler, U. Rüde, Computers & Mathematics with Applications, 81, 478-501 (2021) DOI:10.1016/j.camwa.2020.01.007
[3] M. Bauer, J. Hötzer, D. Ernst, J. Hammer, M. Seiz, H. Hierl, J. Hönig, H. Köstler, G. Wellein, B. Nestler, U. Rüde, Code generation for massively parallel phase-field simulations. Published in: Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis (2019) DOI:10.1145/3295500.3356186