pyiron.base.master package

Submodules

pyiron.base.master.flexible module

class pyiron.base.master.flexible.FlexibleMaster(project, job_name)[source]

Bases: pyiron.base.master.generic.GenericMaster

The FlexibleMaster uses a list of functions to connect multiple jobs in a series.

Parameters
  • project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in

  • job_name (str) – name of the job, which has to be unique within the project

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job, can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - located outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: version

Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.

.. attribute:: executable

Executable used to run the job - usually the path to an external executable.

.. attribute:: library_activated

For job types which offer a Python library, pyiron can use the Python library instead of an external executable.

.. attribute:: server

Server object to handle the execution environment for the job.

.. attribute:: queue_id

the ID returned from the queuing system - it is most likely not the same as the job ID.

.. attribute:: logger

logger object to monitor the external execution and internal pyiron warnings.

.. attribute:: restart_file_list

list of files which are used to restart the calculation.

.. attribute:: job_type

Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]

.. attribute:: child_names

Dictionary matching the child ID to the child job name.
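A minimal usage sketch (hedged: the project name, input field, and HDF5 output path below are illustrative, not part of the documented API). The typical pattern alternates between appending a job and appending a connecting function to function_lst:

    from pyiron import Project

    pr = Project("flexible_demo")  # illustrative project name
    job_a = pr.create_job(pr.job_type.ExampleJob, "job_a")
    job_b = pr.create_job(pr.job_type.ExampleJob, "job_b")

    def transfer(job_prev, job_next):
        # pass a result of the finished job into the input of the next one;
        # the output path and input field are placeholders for a real workflow
        job_next.input["initial_value"] = job_prev["output/generic/energy"]

    flex = pr.create_job(pr.job_type.FlexibleMaster, "flex")
    flex.append(job_a)
    flex.function_lst.append(transfer)  # connects job_a to job_b
    flex.append(job_b)
    flex.run()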

collect_output()[source]

Skip the collect output function - it is not used for the FlexibleMaster

from_hdf(hdf=None, group_name=None)[source]

Restore the FlexibleMaster from an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

function_lst

list of functions which connect the child jobs in series.

is_finished()[source]

Check if the FlexibleMaster job is finished - by checking the job status and the submission status.

Returns

[True/False]

Return type

bool

run_static()[source]

The FlexibleMaster uses functions to connect multiple Jobs.

to_hdf(hdf=None, group_name=None)[source]

Store the FlexibleMaster in an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

validate_ready_to_run()[source]

Validate that the calculation is ready to be executed. By default no generic checks are performed, but one could check that the input information is complete or validate the consistency of the input at this point.

write_input()[source]

Skip the write input function - it is not used for the FlexibleMaster

pyiron.base.master.generic module

class pyiron.base.master.generic.GenericMaster(project, job_name)[source]

Bases: pyiron.base.job.generic.GenericJob

The GenericMaster is the template class for all meta jobs - meaning all jobs which contain multiple other jobs. It defines the shared functionality of the different kinds of job series.

Parameters
  • project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in

  • job_name (str) – name of the job, which has to be unique within the project

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job, can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - located outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: version

Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.

.. attribute:: executable

Executable used to run the job - usually the path to an external executable.

.. attribute:: library_activated

For job types which offer a Python library, pyiron can use the Python library instead of an external executable.

.. attribute:: server

Server object to handle the execution environment for the job.

.. attribute:: queue_id

the ID returned from the queuing system - it is most likely not the same as the job ID.

.. attribute:: logger

logger object to monitor the external execution and internal pyiron warnings.

.. attribute:: restart_file_list

list of files which are used to restart the calculation.

.. attribute:: job_type

Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]

.. attribute:: child_names

Dictionary matching the child ID to the child job name.

append(job)[source]

Append a job to the GenericMaster - just like you would append an element to a list.

Parameters

job (GenericJob) – job to append

child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

Returns

list of child job ids

Return type

list

child_names

Dictionary matching the child ID to the child job name

Returns

{child_id: child job name}

Return type

dict

copy_to(project=None, new_job_name=None, input_only=False, new_database_entry=True)[source]

Copy the content of the job including the HDF5 file to a new location

Parameters
  • project (ProjectHDFio) – project to copy the job to

  • new_job_name (str) – to duplicate the job within the same project it is necessary to modify the job name - optional

  • input_only (bool) – [True/False] to copy only the input - default False

  • new_database_entry (bool) – [True/False] to create a new database entry - default True

Returns

GenericJob object pointing to the new location.

Return type

GenericJob

first_child_name()[source]

Get the name of the first child job

Returns

name of the first child job

Return type

str

from_hdf(hdf=None, group_name=None)[source]

Restore the GenericMaster from an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

get_child_cores()[source]

Calculate the number of currently active cores by summing over all children which are neither finished nor aborted.

Returns

number of cores used

Return type

(int)

static get_function_from_string(function_str)[source]

Convert a string of source code to a function

Parameters

function_str (str) – function source code

Returns

the function object defined by the source code

Return type

function
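A minimal sketch, assuming the string contains a single top-level function definition:

    from pyiron.base.master.generic import GenericMaster

    src = "def add_one(x):\n    return x + 1\n"
    add_one = GenericMaster.get_function_from_string(src)
    print(add_one(41))  # 42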

move_to(project)[source]

Move the content of the job including the HDF5 file to a new location

Parameters

project (ProjectHDFio) – project to move the job to

Returns

JobCore object pointing to the new location.

Return type

JobCore

pop(i=-1)[source]

Pop a job from the GenericMaster - just like you would pop an element from a list

Parameters

i (int) – position of the job. (Default is last element, -1.)

Returns

job

Return type

GenericJob
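Because append and pop mirror Python list semantics, a GenericMaster subclass can be filled and emptied like a list. A hedged sketch, assuming pr, job_a, and job_b exist as in the FlexibleMaster example above:

    master = pr.create_job(pr.job_type.ListMaster, "master")
    master.append(job_a)   # add children list-style
    master.append(job_b)
    last = master.pop()    # remove and return the last appended job
    first = master.pop(0)  # or remove by position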

set_child_id_func(child_id_func)[source]

Add an external function to derive a list of child IDs - experimental feature

Parameters

child_id_func (Function) – Python function which returns the list of child IDs

to_hdf(hdf=None, group_name=None)[source]

Store the GenericMaster in an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

validate_ready_to_run()[source]

Validate that the calculation is ready to be executed. By default no generic checks are performed, but one could check that the input information is complete or validate the consistency of the input at this point.

pyiron.base.master.list module

class pyiron.base.master.list.ListMaster(project, job_name)[source]

Bases: pyiron.base.master.generic.GenericMaster

The ListMaster is the simplest meta job derived from the GenericMaster. It behaves like a Python list object: jobs can be appended to the ListMaster just like elements are added to a list, and then all jobs can be executed together (see the sketch after the attribute list below). This also works for already executed jobs, unless they are already linked to a different meta job - meaning they already have a master ID assigned to them.

Parameters
  • project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in

  • job_name (str) – name of the job, which has to be unique within the project

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job, can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - located outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: version

Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.

.. attribute:: executable

Executable used to run the job - usually the path to an external executable.

.. attribute:: library_activated

For job types which offer a Python library, pyiron can use the Python library instead of an external executable.

.. attribute:: server

Server object to handle the execution environment for the job.

.. attribute:: queue_id

the ID returned from the queuing system - it is most likely not the same as the job ID.

.. attribute:: logger

logger object to monitor the external execution and internal pyiron warnings.

.. attribute:: restart_file_list

list of files which are used to restart the calculation.

.. attribute:: job_type

Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]

.. attribute:: child_names

Dictionary matching the child ID to the child job name.

.. attribute:: submission_status

Monitors how many jobs have been submitted and how many still have to be submitted.
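A minimal usage sketch (project and job names are illustrative):

    from pyiron import Project

    pr = Project("list_demo")
    lst = pr.create_job(pr.job_type.ListMaster, "lst")
    lst.append(pr.create_job(pr.job_type.ExampleJob, "job_1"))
    lst.append(pr.create_job(pr.job_type.ExampleJob, "job_2"))
    lst.run()  # executes all appended children together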

append(job)[source]

Append a job to the ListMaster - just like you would append an element to a list.

Parameters

job (JobCore, GenericJob, int) – job to append

copy()[source]

Copy the ListMaster object which links to the job and its HDF5 file

Returns

New ListMaster object pointing to the same job

Return type

ListMaster

is_finished()[source]

Check if the ListMaster job is finished - by checking the job status and the submission status.

Returns

[True/False]

Return type

bool

iter_jobs(convert_to_object=True)[source]

Iterate over the jobs within the ListMaster

Parameters

convert_to_object (bool) – load the full GenericJob object (default) or just the HDF5 / JobCore object

Returns

Yield of GenericJob or JobCore

Return type

yield
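For example, to read one output value from every child without loading the full job objects (lst as in the sketch above; the HDF5 path is illustrative):

    for child in lst.iter_jobs(convert_to_object=False):
        # child is a lightweight HDF5/JobCore handle here
        print(child.job_name, child["output/generic/energy_tot"])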

refresh_submission_status()[source]

Refresh the submission status - if a job ID (job_id) is set, the submission status is loaded from the database.

reset_job_id(job_id=None)[source]

Reset the job ID - sets job_id to None as well as all connected modules like JobStatus and SubmissionStatus.

run_static()[source]

The run_static function is called by run to execute the simulation. For the ListMaster this means executing all the appended children in parallel.

save()[source]

Save the object, by writing the content to the HDF5 file and storing an entry in the database.

Returns

Job ID stored in the database

Return type

(int)

write_input()[source]

Write the input files - for the ListMaster this only contains the execution mode, which is ‘parallel’ by default.

pyiron.base.master.parallel module

class pyiron.base.master.parallel.GenericOutput[source]

Bases: collections.OrderedDict

GenericOutput is just a placeholder to store the output of the last child directly in the ParallelMaster.

class pyiron.base.master.parallel.JobGenerator(job)[source]

Bases: object

JobGenerator - this class implements the functions to generate the parameter list, to modify the individual jobs according to it, and to generate the new job names.

static modify_job(job, parameter)[source]

next()[source]

Iterate over the child jobs

Returns

new job object

Return type

GenericJob

parameter_list

parameter_list_cached
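A hedged sketch of the intended pattern: a subclass supplies the parameter list, a rule to modify each child job, and a job-name mapping (the class name and input field below are hypothetical):

    from pyiron.base.master.parallel import JobGenerator

    class StrainJobGenerator(JobGenerator):  # hypothetical subclass
        @property
        def parameter_list(self):
            # one entry per child job to be generated
            return [0.98, 1.00, 1.02]

        @staticmethod
        def modify_job(job, parameter):
            # apply the parameter to the child job input (illustrative field)
            job.input["strain"] = parameter
            return job

        def job_name(self, parameter):
            return "strain_" + str(parameter).replace(".", "_")
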
class pyiron.base.master.parallel.ParallelMaster(project, job_name)[source]

Bases: pyiron.base.master.generic.GenericMaster

MasterJob that handles the creation and analysis of several parallel jobs (including master and continuation jobs). Examples are Murnaghan or Phonon calculations; a usage sketch follows the attribute list below.

Parameters
  • project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in

  • job_name (str) – name of the job, which has to be unique within the project

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job, can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - located outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: version

Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.

.. attribute:: executable

Executable used to run the job - usually the path to an external executable.

.. attribute:: library_activated

For job types which offer a Python library, pyiron can use the Python library instead of an external executable.

.. attribute:: server

Server object to handle the execution environment for the job.

.. attribute:: queue_id

the ID returned from the queuing system - it is most likely not the same as the job ID.

.. attribute:: logger

logger object to monitor the external execution and internal pyiron warnings.

.. attribute:: restart_file_list

list of files which are used to restart the calculation.

.. attribute:: job_type

Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]

.. attribute:: child_names

Dictionary matching the child ID to the child job name.

.. attribute:: ref_job

Reference job template from which all jobs within the ParallelMaster are generated.

.. attribute:: number_jobs_total

Total number of jobs
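A hedged usage sketch with Murnaghan, a ParallelMaster subclass from pyiron's atomistic part (pr is an existing Project; the reference job ref_job and the num_points input field are illustrative):

    murn = pr.create_job(pr.job_type.Murnaghan, "murn")  # ParallelMaster subclass
    murn.ref_job = ref_job        # template from which all children are generated
    murn.input["num_points"] = 7  # illustrative input parameter
    murn.run()                    # generates and executes the children
    print(murn.number_jobs_total)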

collect_logfiles()[source]

Collect the log files of the external executable and store the information in the HDF5 file. This method is currently not implemented for the ParallelMaster.

collect_output()[source]

Collect the output files of the external executable and store the information in the HDF5 file. This method has to be implemented in the individual meta jobs derived from the ParallelMaster.

copy()[source]

Copy the GenericJob object which links to the job and its HDF5 file

Returns

New GenericJob object pointing to the same job

Return type

GenericJob

copy_to(project=None, new_job_name=None, input_only=False, new_database_entry=True)[source]

Copy the content of the job including the HDF5 file to a new location

Parameters
  • project (ProjectHDFio) – project to copy the job to

  • new_job_name (str) – to duplicate the job within the same project it is necessary to modify the job name - optional

  • input_only (bool) – [True/False] to copy only the input - default False

  • new_database_entry (bool) – [True/False] to create a new database entry - default True

Returns

GenericJob object pointing to the new location.

Return type

GenericJob

from_hdf(hdf=None, group_name=None)[source]

Restore the ParallelMaster from an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

interactive_ref_job_initialize()[source]

To execute the reference job in interactive mode it is necessary to initialize it.

is_finished()[source]

Check if the ParallelMaster job is finished - by checking the job status and the submission status.

Returns

[True/False]

Return type

bool

iter_jobs(convert_to_object=True)[source]

Iterate over the jobs within the ParallelMaster

Parameters

convert_to_object (bool) – load the full GenericJob object (default) or just the HDF5 / JobCore object

Returns

Yield of GenericJob or JobCore

Return type

yield

number_jobs_total

Get the total number of jobs

Returns

total number of jobs

Return type

int

output_to_pandas(sort_by=None, h5_path='output')[source]

Convert output of all child jobs to a pandas Dataframe object.

Parameters
  • sort_by (str) – sort the output using pandas.DataFrame.sort_values(by=sort_by)

  • h5_path (str) – select child output to include - default=’output’

Returns

output as dataframe

Return type

pandas.DataFrame
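For example, with murn as in the sketch above (the column passed to sort_by is illustrative and depends on what the children store under h5_path):

    df = murn.output_to_pandas(sort_by="energy_tot")
    print(df.head())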

ref_job

Get the reference job template from which all jobs within the ParallelMaster are generated.

Returns

reference job

Return type

GenericJob

refresh_submission_status()[source]

Refresh the submission status - if a job ID (job_id) is set, the submission status is loaded from the database.

reset_job_id(job_id=None)[source]

Reset the job ID - sets job_id to None as well as all connected modules like JobStatus and SubmissionStatus.

run_if_interactive()[source]

For jobs whose executables are available as a Python library, the job can be executed with a library call instead of an external executable. This is usually faster than a single-core Python job.

run_static()[source]

The run_static function is executed within the GenericJob class, and depending on the run_mode of the ParallelMaster and its child jobs a more specific run function is selected.

save()[source]

Save the object, by writing the content to the HDF5 file and storing an entry in the database.

Returns

Job ID stored in the database

Return type

(int)

show_hdf()[source]

Display the output of the child jobs in a human-readable printout

to_hdf(hdf=None, group_name=None)[source]

Store the ParallelMaster in an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

write_input()[source]

Write the input files - this contains the GenericInput of the ParallelMaster as well as resetting the submission status.

pyiron.base.master.serial module

class pyiron.base.master.serial.GenericOutput[source]

Bases: collections.OrderedDict

GenericOutput is just a placeholder to store the output of the last child directly in the SerialMaster.

class pyiron.base.master.serial.SerialMasterBase(project, job_name)[source]

Bases: pyiron.base.master.generic.GenericMaster

The SerialMaster is a meta job consisting of a dynamic list of jobs which are executed in serial mode. The job is derived from the GenericMaster.

Parameters
  • project (ProjectHDFio) – ProjectHDFio instance which points to the HDF5 file the job is stored in

  • job_name (str) – name of the job, which has to be unique within the project

.. attribute:: job_name

name of the job, which has to be unique within the project

.. attribute:: status

execution status of the job, can be one of the following: [initialized, appended, created, submitted, running, aborted, collect, suspended, refresh, busy, finished]

.. attribute:: job_id

unique id to identify the job in the pyiron database

.. attribute:: parent_id

job id of the predecessor job - the job which was executed before the current one in the current job series

.. attribute:: master_id

job id of the master job - a meta job which groups a series of jobs, which are executed either in parallel or in serial.

.. attribute:: child_ids

list of child job ids - only meta jobs have child jobs - jobs which list the meta job as their master

.. attribute:: project

Project instance the job is located in

.. attribute:: project_hdf5

ProjectHDFio instance which points to the HDF5 file the job is stored in

.. attribute:: job_info_str

short string describing the job by its job_name and job ID - mainly used for logging

.. attribute:: working_directory

working directory the job is executed in - located outside the HDF5 file

.. attribute:: path

path to the job as a combination of absolute file system path and path within the HDF5 file.

.. attribute:: version

Version of the hamiltonian, which is also the version of the executable unless a custom executable is used.

.. attribute:: executable

Executable used to run the job - usually the path to an external executable.

.. attribute:: library_activated

For job types which offer a Python library, pyiron can use the Python library instead of an external executable.

.. attribute:: server

Server object to handle the execution environment for the job.

.. attribute:: queue_id

the ID returned from the queuing system - it is most likely not the same as the job ID.

.. attribute:: logger

logger object to monitor the external execution and internal pyiron warnings.

.. attribute:: restart_file_list

list of files which are used to restart the calculation.

.. attribute:: job_type

Job type object with all the available job types: [‘ExampleJob’, ‘SerialMaster’, ‘ParallelMaster’, ‘ScriptJob’, ‘ListMaster’]

.. attribute:: child_names

Dictionary matching the child ID to the child job name.

.. attribute:: start_job

The first job of the series.

.. attribute:: input

The input of the start job - the first job of the series.

collect_logfiles()[source]

The collect_logfiles function is required by the GenericJob class; therefore an empty template is used here.

collect_output()[source]

Collect the output files of the individual jobs and set the output of the last job to be the output of the SerialMaster - so the SerialMaster contains the same output as its last child.

copy()[source]

Copy the GenericJob object which links to the job and its HDF5 file

Returns

New GenericJob object pointing to the same job

Return type

GenericJob

create_next(job_name=None)[source]

Create the next job in the series by duplicating the previous job.

Parameters

job_name (str) – name of the new job - optional - default=’job_<index>’

Returns

next job

Return type

GenericJob

from_hdf(hdf=None, group_name=None)[source]

Restore the SerialMaster from an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

get_from_childs(path)[source]

Extract the output from all child jobs and append it to a list

Parameters

path (str) – path inside the HDF5 files of the individual jobs like ‘output/generic/volume’

Returns

list of output from the child jobs

Return type

list

get_initial_child_name()[source]

Get name of the initial child.

Returns

name of the initial child

Return type

str

input

Get the input of the start job - the first job of the series.

Returns

input of the start job

Return type

GenericParameters

iter_jobs(convert_to_object=True)[source]

Iterate over the jobs within the SerialMaster

Parameters

convert_to_object (bool) – load the full GenericJob object (default) or just the HDF5 / JobCore object

Returns

Yield of GenericJob or JobCore

Return type

yield

ref_job

run_static(**qwargs)[source]

The run static function is called by run to execute the simulation.

set_goal(convergence_goal, **qwargs)[source]

Set a convergence goal for the SerialMaster - this is necessary to stop the series.

Parameters
  • convergence_goal (Function) – the convergence goal can be any Python function, but if external packages are used like numpy they have to be imported within the function.

  • **qwargs – arguments of the convergence goal function.
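A hedged sketch of a convergence goal, following the contract described above (serial_master stands for an existing SerialMaster-derived job; the energy path, tolerance, and use of create_next for the follow-up job are illustrative):

    def convergence_goal(self, **qwargs):
        import numpy as np  # external packages must be imported inside the function
        eps = qwargs.get("eps", 0.2)
        erg_lst = self.get_from_childs("output/generic/energy_tot")
        if len(erg_lst) > 1 and np.abs(erg_lst[-1] - erg_lst[-2]) < eps:
            return True  # converged - the series stops
        return self.create_next()  # not converged - continue the series

    serial_master.set_goal(convergence_goal, eps=0.01)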

show()[source]

List all jobs in the SerialMaster

Returns

list of jobs [‘job’, <index>, <GenericJob>]

Return type

list

start_job

Get the first job of the series.

Returns

start job

Return type

GenericJob

to_hdf(hdf=None, group_name=None)[source]

Store the SerialMaster in an HDF5 file

Parameters
  • hdf (ProjectHDFio) – HDF5 group object - optional

  • group_name (str) – HDF5 subgroup name - optional

write_input()[source]

Write the input files - for the SerialMaster this only contains the convergence goal.

pyiron.base.master.submissionstatus module

class pyiron.base.master.submissionstatus.SubmissionStatus(initial_status='initialized', db=None, job_id=None)[source]

Bases: object

The SubmissionStatus object handles the different submission states a job could have. The available states are:

  • initialized: No jobs have been submitted.

  • sub_m_n: m out of n jobs have been submitted.

  • finished: The job and all connected sub jobs are finished.

Parameters
  • initial_status (str) – If no initial status is provided the status is set to ‘initialized’

  • db (DatabaseAccess) – The database which is responsible for this job.

  • job_id – job ID

.. attribute:: database

the database which is responsible for this job.

.. attribute:: job_id

Job ID

.. attribute:: string

job status as string

.. attribute:: total_jobs

number of jobs which have to be submitted in total

.. attribute:: submitted_jobs

number of jobs which have been submitted
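A minimal sketch of the state progression (assuming total_jobs is settable, as the attribute list above suggests; the exact formatting of the intermediate state follows the sub_m_n description):

    from pyiron.base.master.submissionstatus import SubmissionStatus

    status = SubmissionStatus()   # starts as 'initialized'
    status.total_jobs = 2         # n jobs have to be submitted in total
    status.submit_next()          # one job submitted (m = 1)
    print(status.submitted_jobs)  # 1
    print(status.string)          # intermediate sub_m_n-style state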

STATUS = ['initialized', 'finished']

database

Get the database which is responsible for this job. If no database is linked it returns None.

Returns

The database which is responsible for this job.

Return type

DatabaseAccess

finished

Check if the status is ‘finished’, meaning the job and all connected sub jobs are finished.

Returns

[True/False]

Return type

bool

initialized

Check if the status is ‘initialized’, meaning the object for the corresponding job was just created.

Returns

[True/False]

Return type

bool

refresh()[source]

Refresh the submission status; if a job_id is present, load the current submission status from the database.

string

Get the current status as a string; it can be one of:

  • initialized: No jobs have been submitted.

  • sub_m_n: m out of n jobs have been submitted.

  • finished: The job and all connected sub jobs are finished.

Returns

status [initialized, sub_m_n, finished]

Return type

str

submit_next()[source]

Increase the number of submitted jobs by one: self.submitted_jobs += 1

submitted

Check if the status is ‘submitted’, meaning some but not all of the child jobs have been submitted.

Returns

[True/False]

Return type

bool

submitted_jobs

Get the number of jobs which have been submitted.

Returns

number of submitted jobs

Return type

int

total_jobs

Get the total number of jobs

Returns

total number of jobs

Return type

int

Module contents