
Workflow execution

Local execution

The following script runs a relaxation calculation locally. Set species_dir to point to the FHI-aims species_defaults directory.
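
If the short name "light" used below is not resolved automatically by your configuration, species_dir can instead be set to an absolute path inside species_defaults. This is only a sketch; the path below is a placeholder, and you should check the pymatgen FHI-aims documentation for how your installation resolves species_dir:

user_params = {
    # placeholder path; point this at your own species_defaults installation
    "species_dir": "/path/to/FHIaims/species_defaults/defaults_2020/light",
    "k_grid": [7, 7, 7],
}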

si_relax.py:

#!/usr/bin/env python

from pymatgen.core import Structure, Lattice
from pymatgen.io.aims.sets.core import RelaxSetGenerator
from atomate2.aims.jobs.core import RelaxMaker
from jobflow import run_locally


a = 2.715
lattice = Lattice([[0.0, a, a], [a, 0.0, a], [a, a, 0.0]])
si = Structure(
    lattice=lattice,
    species=["Si", "Si"],
    coords=[[0, 0, 0], [0.25, 0.25, 0.25]],
)

# Create relax job
relax_job = RelaxMaker(
    input_set_generator=RelaxSetGenerator(
        user_params={"species_dir": "light", "k_grid": [7, 7, 7]}
    )
).make(si)

# Run the relax job locally; run_locally returns a dict mapping each
# job's uuid to its response(s), including the parsed output
j_id = run_locally(relax_job)

In your terminal, run:

python si_relax.py

Once the calculation is complete, j_id holds the responses of all jobs in the flow, including the output information parsed from aims.out. The same output is also stored in the MongoDB database atomate2.
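
As a minimal sketch, the same parsed output can also be pulled straight from the return value of run_locally; this assumes jobflow's default behaviour, where each job has a single run stored under index 1:

for uuid, responses in j_id.items():
    # responses maps the run index (1 for a single run) to a jobflow Response
    task_doc = responses[1].output
    print(uuid, task_doc)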

Remote execution

The following script, si_relax_remote.py, submits an FHI-aims relaxation calculation from the local computer to the remote server.

si_relax_remote.py:

#!/usr/bin/env python

from pymatgen.core import Structure, Lattice
from pymatgen.io.aims.sets.core import RelaxSetGenerator
from atomate2.aims.jobs.core import RelaxMaker
from jobflow_remote import submit_flow


a = 2.715
lattice = Lattice([[0.0, a, a], [a, 0.0, a], [a, a, 0.0]])
si = Structure(
    lattice=lattice,
    species=["Si", "Si"],
    coords=[[0, 0, 0], [0.25, 0.25, 0.25]],
)

# Create relax job
relax_job = RelaxMaker(
    input_set_generator=RelaxSetGenerator(
        user_params={"species_dir": "light", "k_grid": [7, 7, 7]}
    )
).make(si)

resource = {"nodes": 4, "ntasks_per_node": 4, "partition": "small"}

# Submit the relax job to the remote cluster
j_id = submit_flow(relax_job, project="timewarp", resources=resource)

On your local computer, run:

python si_relax_remote.py 

You can monitor the job status with the command jf job list from your local computer. You may submit as many calculations as you want, since the job scheduler on the remote cluster takes care of queuing and executing them. Once a calculation is complete, its output data can be accessed in the MongoDB database timewarp. Each job is saved with a unique identifier (uuid), which looks something like uuid: "71411da5-078c-4f88-8362-9da3ae3263ea".

Importing MongoDB data in Python

The calculation data is saved in the database timewarp in the collection outputs. Each run is a document within the collection and can be referenced by its uuid using Python. The following script displays the structural information and bandgap of the material.

from jobflow_remote import get_jobstore

# Connect to the output store defined in the jobflow-remote project
js = get_jobstore()
js.connect()

# Retrieve the parsed output of the run by its uuid
data = js.get_output("71411da5-078c-4f88-8362-9da3ae3263ea")

# output structural information
print(data["structure"])

# output bandgap 
print(data["output"]["bandgap"])
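Since each run is simply a document in the outputs collection, the same data can also be inspected directly with pymongo. This is only a sketch: the connection URI is a placeholder, and the host, credentials, and document layout must match your jobflow-remote store configuration:

from pymongo import MongoClient

# placeholder URI; use the host and credentials from your jobflow-remote project
client = MongoClient("mongodb://localhost:27017")
collection = client["timewarp"]["outputs"]

# each run is one document, keyed by its uuid
doc = collection.find_one({"uuid": "71411da5-078c-4f88-8362-9da3ae3263ea"})
print(doc.keys())  # top-level fields of the stored document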

Next steps

What we just did was a simple structural relaxation calculation. However, the real power of Atomate2 lies in the ability to chain calculations into workflows, for instance a relaxation followed by a band-structure calculation. Please refer to the Atomate2 documentation for guidance on how to do this and much more.
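
As a hedged illustration of chaining, the sketch below follows the relaxation with a static calculation on the relaxed structure, connected through a jobflow output reference. It assumes that StaticMaker and StaticSetGenerator are available in your versions of atomate2 and pymatgen; it is not the dedicated band-structure workflow, which is described in the Atomate2 documentation.

from pymatgen.core import Structure, Lattice
from pymatgen.io.aims.sets.core import RelaxSetGenerator, StaticSetGenerator
from atomate2.aims.jobs.core import RelaxMaker, StaticMaker
from jobflow import Flow, run_locally

# the same silicon structure as above
a = 2.715
si = Structure(
    lattice=Lattice([[0.0, a, a], [a, 0.0, a], [a, a, 0.0]]),
    species=["Si", "Si"],
    coords=[[0, 0, 0], [0.25, 0.25, 0.25]],
)

params = {"species_dir": "light", "k_grid": [7, 7, 7]}

# relaxation, as in si_relax.py
relax_job = RelaxMaker(
    input_set_generator=RelaxSetGenerator(user_params=params)
).make(si)

# static run on the relaxed structure, passed in as an output reference
static_job = StaticMaker(
    input_set_generator=StaticSetGenerator(user_params=params)
).make(relax_job.output.structure)

# chain the two jobs into a single workflow and run it
flow = Flow([relax_job, static_job])
run_locally(flow)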