pyiron.atomistics.job package¶
Submodules¶
pyiron.atomistics.job.atomistic module¶
class pyiron.atomistics.job.atomistic.AtomisticGenericJob(project, job_name)[source]¶
Bases: pyiron.base.job.generic.GenericJob
The AtomisticGenericJob class extends the GenericJob class with all the functionality needed to run jobs containing atomistic structures. All specific atomistic Hamiltonians are derived from this class, so it contains the properties and routines common to all atomistic jobs. The functions in this module should be as generic as possible.
Parameters:
- project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in
- job_name (str) – name of the job, which has to be unique within the project
Attributes:
- job_name – name of the job, which has to be unique within the project
- status – execution status of the job; one of [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]
- job_id – unique id to identify the job in the pyiron database
- parent_id – job id of the predecessor job - the job which was executed before the current one in the current job series
- master_id – job id of the master job - a meta job which groups a series of jobs which are executed either in parallel or in serial
- child_ids – list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master
- project – Project instance the job is located in
- project_hdf5 – ProjectHDFio instance which points to the HDF5 file the job is stored in
- job_info_str – short string describing the job by its job_name and job ID - mainly used for logging
- working_directory – working directory the job is executed in - outside the HDF5 file
- path – path to the job as a combination of the absolute file system path and the path within the HDF5 file
- version – version of the Hamiltonian, which is also the version of the executable unless a custom executable is used
- executable – executable used to run the job - usually the path to an external executable
- library_activated – for job types which offer a Python library, pyiron can use the Python library instead of an external executable
- server – Server object to handle the execution environment for the job
- queue_id – the ID returned from the queuing system - it is most likely not the same as the job ID
- logger – logger object to monitor the external execution and internal pyiron warnings
- restart_file_list – list of files from which the calculation can be restarted
- job_type – Job type object with all the available job types: ['ExampleJob', 'SerialMaster', 'ParallelMaster', 'ScriptJob', 'ListMaster']
animate_structure(spacefill=True, show_cell=True, stride=1, center_of_mass=False, particle_size=0.5)[source]¶
Animates the job if a trajectory is present.
Parameters:
- spacefill (bool) –
- show_cell (bool) –
- stride (int) – show every stride-th frame [::stride]; use a value > 1 to speed up the animation (default=1)
- center_of_mass (bool) –
Returns: nglview IPython widget
Return type: animation
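A hedged usage sketch (the project and job names are hypothetical, and a finished MD job with a stored trajectory is assumed):

from pyiron import Project

pr = Project("animation_demo")          # hypothetical project
job = pr.load("md_al")                  # assumed: a finished MD job with a trajectory
view = job.animate_structure(stride=5)  # show only every 5th frame to speed up playback
view                                    # display the nglview widget in a Jupyter notebook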
calc_md(temperature=None, pressure=None, n_ionic_steps=1000, time_step=None, n_print=100, temperature_damping_timescale=100.0, pressure_damping_timescale=None, seed=None, tloop=None, initial_temperature=True, langevin=False)[source]¶
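The signature above carries no docstring; the following minimal sketch shows a typical NVT setup. The Lammps job type, the create_ase_bulk helper and the project name are assumptions used for illustration only:

from pyiron import Project

pr = Project("md_demo")                                # hypothetical project
job = pr.create_job(pr.job_type.Lammps, "md_al")       # illustrative job type
job.structure = pr.create_ase_bulk("Al", cubic=True)   # assumed structure helper
job.calc_md(temperature=300,        # thermostat target in K
            n_ionic_steps=10000,    # total number of MD steps
            n_print=100)            # store output every 100 steps
job.run()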
calc_minimize(e_tol=1e-08, f_tol=1e-08, max_iter=1000, pressure=None, n_print=1)[source]¶
Parameters:
- e_tol –
- f_tol –
- max_iter –
- pressure –
- n_print –
Returns:
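A minimal sketch of an ionic (and optional cell) relaxation; the Lammps job type and the structure helper are again illustrative assumptions:

from pyiron import Project

pr = Project("minimize_demo")
job = pr.create_job(pr.job_type.Lammps, "min_al")      # illustrative job type
job.structure = pr.create_ase_bulk("Al", cubic=True)
job.calc_minimize(f_tol=1e-4,       # force convergence threshold
                  pressure=0.0)     # also relax the cell towards zero pressure
job.run()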
continue_with_final_structure(job_type=None, job_name=None)[source]¶
Parameters:
- job_type –
- job_name –
Returns:
continue_with_restart_files(job_type=None, job_name=None)[source]¶
Parameters:
- job_type –
- job_name –
Returns:
copy_to(project=None, new_job_name=None, input_only=False, new_database_entry=True)[source]¶
Parameters:
- project –
- new_job_name –
- input_only –
- new_database_entry –
Returns:
from_hdf(hdf=None, group_name=None)[source]¶
Recreates the instance from the HDF5 file.
Parameters:
- hdf (str) – path to the HDF5 file
- group_name (str) – name of the group which contains the object
get_structure(iteration_step=-1)[source]¶
Gets the structure from a given iteration step of the simulation (MD/ionic relaxation). For static calculations there is only one ionic iteration step.
Parameters:
- iteration_step (int) – step for which the structure is requested
Returns: the requested structure
Return type: pyiron.atomistics.structure.atoms.Atoms
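A short sketch, assuming `job` is a finished AtomisticGenericJob with a stored trajectory:

final_structure = job.get_structure(iteration_step=-1)   # last snapshot
initial_structure = job.get_structure(iteration_step=0)  # first snapshot
print(len(final_structure), final_structure.cell)        # Atoms object: atom count and cell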
restart(snapshot=-1, job_name=None, job_type=None)[source]¶
Create a new job from an existing calculation in order to restart it.
Parameters:
- snapshot (int) – snapshot of the existing calculation which becomes the initial structure of the new job
- job_name (str) – job name
- job_type (str) – job type
Returns: new job
Return type: new_ham
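A sketch, assuming `job` is a finished calculation; the new job name is hypothetical:

new_job = job.restart(snapshot=-1, job_name="al_continued")  # continue from the last snapshot
new_job.run()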
set_kpoints(mesh=None, scheme='MP', center_shift=None, symmetry_reduction=True, manual_kpoints=None, weights=None, reciprocal=True)[source]¶
Parameters:
- mesh –
- scheme –
- center_shift –
- symmetry_reduction –
- manual_kpoints –
- weights –
- reciprocal –
Returns:
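set_kpoints only has an effect for electronic-structure job types; a hedged sketch assuming a DFT-type job (the Vasp job type and project handle are illustrative assumptions):

# assuming 'job' is a DFT-type job, e.g. pr.create_job(pr.job_type.Vasp, "dft_al")
job.set_kpoints(mesh=[4, 4, 4],            # 4x4x4 Monkhorst-Pack mesh
                scheme="MP",
                center_shift=[0, 0, 0])    # no shift of the mesh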
structure¶
to_hdf(hdf=None, group_name=None)[source]¶
Store the GenericJob in an HDF5 file.
Parameters:
- hdf (ProjectHDFio) – HDF5 group object - optional
- group_name (str) – HDF5 subgroup name - optional
trajectory(stride=1, center_of_mass=False, atom_indices=None, snapshot_indices=None)[source]¶
Parameters:
- stride (int) – the trajectory is generated from every 'stride'-th step
- center_of_mass (bool) – True if the positions should be centered on the center of mass
- atom_indices (list/numpy.ndarray) – the atom indices for which the trajectory should be generated
- snapshot_indices (list/numpy.ndarray) – the snapshots for which the trajectory should be generated
Returns: Trajectory instance
Return type: pyiron.atomistics.job.atomistic.Trajectory
view_structure(snapshot=-1, spacefill=True, show_cell=True)[source]¶
Parameters:
- snapshot (int) – snapshot of the trajectory to display
- spacefill (bool) –
- show_cell (bool) –
Returns: nglview IPython widget
Return type: view
write_traj(filename, file_format=None, parallel=True, append=False, stride=1, center_of_mass=False, atom_indices=None, snapshot_indices=None, **kwargs)[source]¶
Writes the trajectory to a file in the given file_format, based on the ase.io.write function.
Parameters:
- filename (str) – filename of the output
- file_format (str) – the specific file_format of the output
- parallel (bool) –
- append (bool) –
- stride (int) – writes the trajectory every 'stride' steps
- center_of_mass (bool) – True if the positions are centered on the COM
- atom_indices (list/numpy.ndarray) – the atom indices for which the trajectory should be generated
- snapshot_indices (list/numpy.ndarray) – the snapshots for which the trajectory should be generated
- **kwargs – additional ase arguments
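A sketch combining trajectory() and write_traj(), assuming `job` is a finished MD job; the output filename and the xyz format are illustrative choices:

traj = job.trajectory(stride=10)              # Trajectory over every 10th snapshot
job.write_traj("md_al.xyz",                   # hypothetical output file
               file_format="xyz",             # any format understood by ase.io.write
               stride=10)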
class pyiron.atomistics.job.atomistic.GenericInput(input_file_name=None, table_name='generic')[source]¶
class pyiron.atomistics.job.atomistic.GenericOutput(job)[source]¶
Bases: object
cells¶
computation_time¶
energy_pot¶
energy_tot¶
forces¶
positions¶
pressures¶
steps¶
temperature¶
unwrapped_positions¶
volume¶
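The properties above expose the per-step output arrays of a finished job, typically reached via job.output; a hedged sketch (the array shapes reflect common pyiron conventions, not a guarantee):

# 'job' is assumed to be a finished atomistic job
energies = job.output.energy_pot      # potential energy per ionic step
forces = job.output.forces            # typically shaped (n_steps, n_atoms, 3)
final_volume = job.output.volume[-1]  # cell volume of the last step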
class pyiron.atomistics.job.atomistic.Trajectory(positions, structure, center_of_mass=False, cells=None)[source]¶
Bases: object
A trajectory instance compatible with the ase.io module.
Parameters:
- positions (numpy.ndarray) – the array of the trajectory in cartesian coordinates
- structure (pyiron.atomistics.structure.atoms.Atoms) – the initial structure instance from which the species info is derived
- center_of_mass (bool) – False (default) if the specified positions are w.r.t. the origin
- cells (numpy.ndarray) – optional array of the cell shape at every time step (Nx3x3 array) for when the volume varies
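A construction sketch using arrays from a finished job; the indexing behaviour is assumed from the ase.io compatibility stated above:

from pyiron.atomistics.job.atomistic import Trajectory

# positions: (n_steps, n_atoms, 3), cells: (n_steps, 3, 3) - taken here from a finished job
traj = Trajectory(job.output.positions, job.structure, cells=job.output.cells)
snapshot = traj[0]   # assumed to yield an Atoms-like snapshot, as used by write_traj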
pyiron.atomistics.job.interactive module¶
class pyiron.atomistics.job.interactive.GenericInteractive(project, job_name)[source]¶
Bases: pyiron.atomistics.job.atomistic.AtomisticGenericJob, pyiron.base.job.interactive.InteractiveBase
current_structure¶
get_structure(iteration_step=-1)[source]¶
Gets the structure from a given iteration step of the simulation (MD/ionic relaxation). For static calculations there is only one ionic iteration step.
Parameters:
- iteration_step (int) – step for which the structure is requested
Returns: atomistics.structure.atoms.Atoms object
initial_structure¶
interactive_enforce_structure_reset¶
interactive_flush(path='interactive', include_last_step=False)[source]¶
Parameters:
- path –
- include_last_step –
Returns:
run_if_interactive()[source]¶
For jobs whose executables are available as a Python library, the job can be executed via a library call instead of calling an external executable. This is usually faster than a single-core Python job.
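A hedged sketch of the interactive (library) mode; the Lammps job type, calc_static() and interactive_close() reflect typical pyiron usage for job types that support the library interface, but are assumptions here:

from pyiron import Project

pr = Project("interactive_demo")
job = pr.create_job(pr.job_type.Lammps, "lmp_interactive")  # illustrative job type
job.structure = pr.create_ase_bulk("Al", cubic=True)
job.server.run_mode.interactive = True     # execute through the library interface
job.calc_static()                          # assumed: provided by the concrete job type
job.run()                                  # dispatches to run_if_interactive()
job.structure.positions[0, 2] += 0.1       # modify the structure in place
job.run()                                  # re-evaluate without restarting the executable
job.interactive_close()                    # flush results and close the session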
structure¶
class pyiron.atomistics.job.interactive.GenericInteractiveOutput(job)[source]¶
Bases: pyiron.atomistics.job.atomistic.GenericOutput
cells¶
energy_pot¶
energy_tot¶
forces¶
indices¶
positions¶
pressures¶
steps¶
temperature¶
time¶
unwrapped_positions¶
volume¶
pyiron.atomistics.job.interactivewrapper module¶
class pyiron.atomistics.job.interactivewrapper.InteractiveWrapper(project, job_name)[source]¶
Bases: pyiron.base.master.generic.GenericMaster
collect_logfiles()[source]¶
Collect the log files of the external executable and store the information in the HDF5 file. This method has to be implemented in the individual Hamiltonians.
collect_output()[source]¶
Collect the output files of the external executable and store the information in the HDF5 file. This method has to be implemented in the individual Hamiltonians.
from_hdf(hdf=None, group_name=None)[source]¶
Restore the InteractiveWrapper from an HDF5 file.
Parameters:
- hdf (ProjectHDFio) – HDF5 group object - optional
- group_name (str) – HDF5 subgroup name - optional
ref_job¶
Get the reference job template from which all jobs within the ParallelMaster are generated.
Returns: reference job
Return type: GenericJob
structure¶
to_hdf(hdf=None, group_name=None)[source]¶
Store the InteractiveWrapper in an HDF5 file.
Parameters:
- hdf (ProjectHDFio) – HDF5 group object - optional
- group_name (str) – HDF5 subgroup name - optional
pyiron.atomistics.job.potentials module¶
An abstract Potential class to provide easy access to the available potentials. Currently implemented for the OpenKIM database (https://openkim.org).
class pyiron.atomistics.job.potentials.PotentialAbstract(potential_df, default_df=None, selected_atoms=None)[source]¶
Bases: object
The PotentialAbstract class loads a list of available potentials and sorts them. Afterwards the potentials can be accessed through:
PotentialAbstract.<Element>.<Element> or PotentialAbstract.find_potentials_set({<Element>, <Element>})
Parameters:
- potential_df –
- default_df –
- selected_atoms –
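In practice the potential lookup is usually reached through a concrete job type rather than PotentialAbstract directly; a sketch assuming a Lammps job with the usual list_potentials()/potential interface (an assumption, not part of this class):

# assuming 'pr' is a Project and the Lammps job type is available
job = pr.create_job(pr.job_type.Lammps, "lmp_al")
job.structure = pr.create_ase_bulk("Al", cubic=True)
print(job.list_potentials())                # potentials matching the elements in the structure
job.potential = job.list_potentials()[0]    # select the first matching potential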
pyiron.atomistics.job.structurecontainer module¶
class pyiron.atomistics.job.structurecontainer.StructureContainer(project, job_name)[source]¶
Bases: pyiron.atomistics.job.atomistic.AtomisticGenericJob
append(structure_to_append)[source]¶
Meta jobs like GenericMaster, ParallelMaster, SerialMaster or ListMaster allow other jobs to be appended. In the GenericJob definition this is only a template function.
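A minimal sketch of collecting structures in a StructureContainer; the project and job names are hypothetical and create_ase_bulk is an assumed helper:

from pyiron import Project

pr = Project("structure_store")
container = pr.create_job(pr.job_type.StructureContainer, "structures")
container.append(pr.create_ase_bulk("Al", cubic=True))   # store a first structure
container.append(pr.create_ase_bulk("Cu", cubic=True))   # and a second one
container.run()                                          # writes the container to HDF5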
from_hdf(hdf=None, group_name=None)[source]¶
Recreates the instance from the HDF5 file.
Parameters:
- hdf (str) – path to the HDF5 file
- group_name (str) – name of the group which contains the object
run_if_interactive()[source]¶
For jobs whose executables are available as a Python library, the job can be executed via a library call instead of calling an external executable. This is usually faster than a single-core Python job.
structure¶
to_hdf(hdf=None, group_name=None)[source]¶
Store the GenericJob in an HDF5 file.
Parameters:
- hdf (ProjectHDFio) – HDF5 group object - optional
- group_name (str) – HDF5 subgroup name - optional