Welcome to COMPS¶
The COmputational Modeling Platform Service (COMPS) is an online tool for submitting, running, and managing computational simulations on high-performance computing (HPC) clusters.
pyCOMPS is a Python package for interoperability with the COMPS REST API. It can be used as an interface for other tools such as idmtools or from standalone custom scripts.
Overview of COMPS¶
The COmputational Modeling Platform Service (COMPS) is a web-based user interface that facilitates research by providing access to high-performance computing environments. COMPS allows for submitting, running, and managing computational simulations using EMOD. For more information about EMOD, see Overview of EMOD software and Model overview.
Note
To access and use COMPS you must receive approval and credentials from IDM. Send your request to support@idmod.org.
The following table lists some of the core features:
Dashboard
The COMPS dashboard provides an overview of computing cluster usage, including current and queued jobs. Resource management is simple due to the job-priority system used by the platform.
Multi-Chart
COMPS provides powerful charting functionality to visualize the output channels for simulations. A chart can include output for a single simulation or for multiple simulations. Viewing multiple simulations in a single chart (multi-chart) provides a fast, flexible way to filter simulations to view only data of interest.
Weather Visualization
COMPS creates input files for demographics, migration, and weather to use in a simulation. The spatial and temporal weather data includes air temperature, relative humidity, and rainfall for many geographic regions across the globe using weather station readings and satellite data. COMPS provides a visualization of weather patterns over time overlaid on regional maps.
COMPS terminology¶
COMPS uses the following terminology:
Experiment
An experiment is a logical grouping of simulations. This allows for managing numerous simulations as a single unit.
Suite
A suite is a logical grouping of experiments. This allows for managing multiple experiments as a single unit.
Asset Collection
A collection of user-created input files, such as demographics, temperature, weather, and overlay files. These files are stored in COMPS and can be made available for use by other users.
The following diagram helps illustrate the relationship:

Create simulations¶
COMPS allows for creating and submitting individual simulations to be run by EMOD.
Create simulations¶
You can use COMPS to create and submit simulations to be run by EMOD.
Simulations must contain a JSON-formatted Configuration file. For more information about configuration files, see Configuration file. They may also contain a Campaign file and Additional Files, such as a readme file, input files, or any other files you would like to include and associate with the simulation. For more information about campaign files, see Campaign file. For more information about input files, see Input files.
For step-by-step instructions on using COMPS to create a simulation, see:
How to create simulations¶
Create simulation¶
Follow the steps below to create a simulation.
On the top left corner of COMPS, click the hamburger icon.
Click Create.
Click Simulation.
Under Configuration, click Choose File.
Select a JSON-formatted configuration file.
Optional: Under Campaign, click Choose File.
Optional: Under Additional Files, click Choose File(s).
Under Meta, enter a name and then click Create!.
To see the specific configuration file used in this documentation set, see Example JSON configuration file. To see the specific campaign file used in this documentation set, see Example JSON campaign file.
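You can also create and run a simulation programmatically with pyCOMPS, using the classes documented in the API reference later in this page. The following is a minimal sketch, not a complete recipe: the host string, environment name, node group, executable path, and input arguments are placeholders for values valid in your COMPS environment, the 'input' file type is assumed, and it assumes Experiment supports the same save() pattern documented below for Simulation.
# Minimal pyCOMPS sketch: create an experiment, attach a simulation with its
# config and campaign files, and commission it. Placeholder values are marked.
from COMPS import Client
from COMPS.Data import Configuration, Experiment, Simulation, SimulationFile

Client.login('https://comps.idmod.org')            # prompts for credentials

config = Configuration(
    environment_name='ExampleEnv',                 # placeholder environment
    executable_path='Eradication.exe',             # placeholder executable
    node_group_name='emod_abcd',                   # placeholder node group
    simulation_input_args='--config config.json --input-path .'  # placeholder
)

exp = Experiment(name='Generic default example', configuration=config)
exp.save()                                         # assigns exp.id

sim = Simulation(name='00_Generic_DEFAULT', experiment_id=exp.id)
sim.add_file(SimulationFile('config.json', 'input'), file_path='config.json')
sim.add_file(SimulationFile('campaign.json', 'input'), file_path='campaign.json')
sim.save()

exp.commission()                                   # start the contained simulation(s)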
Example JSON configuration file¶
To use this example configuration file along with the included campaign file, select EMOD 2.7 and SamplesInput from the drop-down boxes that appear after you select the configuration file when creating a new simulation.
Configuration file¶
{
"parameters": {
"Acquisition_Blocking_Immunity_Decay_Rate": 0.5,
"Acquisition_Blocking_Immunity_Duration_Before_Decay": 60,
"Age_Initialization_Distribution_Type": "DISTRIBUTION_SIMPLE",
"Animal_Reservoir_Type": "NO_ZOONOSIS",
"Base_Incubation_Period": 1,
"Base_Individual_Sample_Rate": 1,
"Base_Infectious_Period": 4,
"Base_Infectivity": 0.9357678407400043,
"Base_Mortality": 0,
"Base_Population_Scale_Factor": 1,
"Birth_Rate_Dependence": "POPULATION_DEP_RATE",
"Birth_Rate_Time_Dependence": "NONE",
"Burnin_Cache_Mode": "none",
"Burnin_Cache_Period": 0,
"Burnin_Name": "",
"Campaign_Filename": "campaign.json",
"Climate_Model": "CLIMATE_OFF",
"Config_Name": "00_Generic_DEFAULT",
"Custom_Reports_Filename": "NoCustomReports",
"Death_Rate_Dependence": "NONDISEASE_MORTALITY_OFF",
"Default_Geography_Initial_Node_Population": 1000,
"Default_Geography_Torus_Size": 10,
"Demographics_Filenames": [
"generic_scenarios_demographics.json"
],
"Enable_Absolute_Time": "NO",
"Enable_Aging": 1,
"Enable_Birth": 1,
"Enable_Default_Reporting": 1,
"Enable_Default_Shedding_Function": 1,
"Enable_Demographics_Birth": 0,
"Enable_Demographics_Builtin": 0,
"Enable_Demographics_Gender": 1,
"Enable_Demographics_Initial": 1,
"Enable_Demographics_Other": 0,
"Enable_Demographics_Reporting": 1,
"Enable_Disease_Mortality": 0,
"Enable_Heterogeneous_Intranode_Transmission": 0,
"Enable_Immune_Decay": 0,
"Enable_Immunity": 1,
"Enable_Interventions": 1,
"Enable_Maternal_Transmission": 0,
"Enable_Property_Output": 0,
"Enable_Spatial_Output": 0,
"Enable_Superinfection": 0,
"Enable_Vital_Dynamics": 0,
"Geography": "",
"Immunity_Acquisition_Factor": 0,
"Immunity_Initialization_Distribution_Type": "DISTRIBUTION_OFF",
"Immunity_Mortality_Factor": 0,
"Immunity_Transmission_Factor": 0,
"Incubation_Period_Distribution": "FIXED_DURATION",
"Individual_Sampling_Type": "TRACK_ALL",
"Infectivity_Scale_Type": "CONSTANT_INFECTIVITY",
"Infection_Updates_Per_Timestep": 1,
"Infectious_Period_Distribution": "EXPONENTIAL_DURATION",
"Job_Node_Groups": "Chassis08",
"Job_Priority": "BELOWNORMAL",
"Listed_Events": [],
"Load_Balance_Filename": "",
"Local_Simulation": 0,
"Maternal_Transmission_Probability": 0,
"Max_Individual_Infections": 1,
"Max_Node_Population_Samples": 40,
"Migration_Model": "NO_MIGRATION",
"Minimum_Adult_Age_Years": 15,
"Mortality_Blocking_Immunity_Decay_Rate": 0.001,
"Mortality_Blocking_Immunity_Duration_Before_Decay": 60,
"Mortality_Time_Course": "DAILY_MORTALITY",
"Node_Grid_Size": 0.042,
"Number_Basestrains": 1,
"Number_Substrains": 1,
"Num_Cores": 1,
"PKPD_Model": "FIXED_DURATION_CONSTANT_EFFECT",
"Population_Density_C50": 30,
"Population_Density_Infectivity_Correction": "CONSTANT_INFECTIVITY",
"Population_Scale_Type": "USE_INPUT_FILE",
"Report_Event_Recorder": 0,
"Run_Number": 1,
"Sample_Rate_Birth": 1,
"Sample_Rate_0_18mo": 1,
"Sample_Rate_10_14": 1,
"Sample_Rate_15_19": 1,
"Sample_Rate_18mo_4yr": 1,
"Sample_Rate_20_Plus": 1,
"Sample_Rate_5_9": 1,
"Serialization_Test_Cycles": 0,
"Simulation_Duration": 3616,
"Simulation_Timestep": 1,
"Simulation_Type": "GENERIC_SIM",
"Start_Time": 0,
"Susceptibility_Scale_Type": "CONSTANT_SUSCEPTIBILITY",
"Transmission_Blocking_Immunity_Decay_Rate": 0.1,
"Transmission_Blocking_Immunity_Duration_Before_Decay": 60,
"x_Air_Migration": 1,
"x_Birth": 1,
"x_Local_Migration": 1,
"x_Other_Mortality": 1,
"x_Population_Immunity": 1,
"x_Regional_Migration": 1,
"x_Sea_Migration": 1,
"x_Temporary_Larval_Habitat": 1
}
}
Example JSON campaign file¶
To use this example campaign file along with the included configuration file, select EMOD 2.7 and SamplesInput from the drop-down boxes that appear after you select the configuration file when creating a new simulation.
Campaign file¶
{
    "Use_Defaults": 1,
    "Campaign_Name": "Initial Seeding",
    "Events": [
        {
            "Event_Coordinator_Config": {
                "Intervention_Config": {
                    "Outbreak_Source": "PrevalenceIncrease",
                    "Antigen": 0,
                    "class": "OutbreakIndividual",
                    "Genome": 0
                },
                "Timesteps_Between_Repetitions": 1742,
                "class": "StandardInterventionDistributionEventCoordinator",
                "Target_Demographic": "Everyone",
                "Demographic_Coverage": 0.6388961673980895
            },
            "Start_Day": 1,
            "class": "CampaignEvent",
            "Event_Name": "Outbreak",
            "Nodeset_Config": {
                "class": "NodeSetAll"
            }
        }
    ]
}
Create climate input¶
COMPS enables you to create climate files for selected areas over selected time scales. Climate files (pre-made datasets) for many regions are included in COMPS. You can also create and customize your own dataset. The climate input files generated with COMPS can then be used as additional data input for your simulations.
How to create climate input¶
COMPS allows you to create climate input files that can be used as additional data for your simulations.
Create climate input¶
Follow the steps below to create climate input files.
On the top left corner of COMPS, click the hamburger icon.
Click Create, and then click Model Input Files.
Click Search Administrative Districts and enter a location for which you would like to generate climate data. In this example, ‘vietnam’ is used as the search term. The matching region is highlighted in green on the map.
Click the returned search value, which in this example is:
The selected area will then turn red:
Under Layers, accept the selected default values (Climate Projects and Selected Nodes).
Click Submit.
The Create Input Files dialog box will then appear.
Under Project, select the corresponding project name for the region you selected. In this example, it is IDM-Vietnam.
Click Generate to create the climate input files.
Manage simulations¶
COMPS provides a graphical user interface for viewing and managing your simulations. For example, you can perform the following with COMPS:
Use the dashboard to monitor compute cluster usage and the status of jobs, including current and queued items.
View and analyze data of interest for multiple simulations in a single chart (multi-chart viewing).
View and analyze weather patterns over time from satellite data and weather station readings overlaid on regional maps.
Clean up the resources consumed while running simulations.
How to use dashboard for managing simulations¶
The dashboard shows numerous graphs and breakdowns that help to visualize who is using which resources and when. These graphs break down the amount of disk space used by each user on a given day, as well as the time spent using the cluster. These metrics facilitate team management of disk space limitations and time usage.
Use dashboard for managing simulations¶
From the dashboard view in COMPS, you can view and analyze the following:
Simulations Currently in Queue
Workflows Currently Processing
Core Processing Time Consumed
Home Disk Space in Use
Simulations Processed per Owner
Cluster Disk Space in Use
Simulations Processed per State
Simulations Processed per Node Group
Follow the steps below to use the dashboard for managing simulations.
Multi-chart views of simulations¶
COMPS allows you to create a chart of the results from your completed simulations for comparison and analysis.
Create multi-chart views¶
Follow the steps below to create multi-chart views of simulations.
On the top left corner of COMPS, click the hamburger icon.
Click Explore, and then click Simulations. Select the succeeded simulations you would like to chart.
Note
You can also create multi-chart views under Experiments by selecting an experiment, clicking on Succeeded, and then clicking Chart.
On the top, click Chart.
Accept the defaults and click Create Chart.
You can then compare and begin analysis of the results of your simulations.
You can also select different parameters for charting analysis.
How to view weather data visualizations¶
COMPS includes spatial and temporal weather data from satellite data and weather station readings. This includes air temperature, relative humidity, and rainfall for many geographic regions across the globe.
View weather data visualizations¶
Follow the steps below to view weather data visualizations.
On the top left corner of COMPS, click the hamburger icon.
Click Create, and then click Model Input Files.
Click the drop-down arrow for Climate Project.
Select one of the available IDM climate projects.
Choose a Parameter type for visualization and then select a Year. In this example, IDM-Ghana, Temperature, and 2012 were selected.
Click the play button:
You can then view the animated visualization over time of the parameter, such as temperature, overlaid on top of the selected climate project region:
You can also view and analyze the generated graph with data for the mean temperature, humidity, and rainfall:
How to use COMPS for cleaning up used resources from running simulations¶
After you’ve gathered the needed information from running your simulations, you can use COMPS to help clean up used resources. This assists with the operation and maintenance of the shared resources. Before deleting any simulations, make sure you have saved the desired files and output data. To help determine whether resources need to be cleaned up, you can use the dashboard to look at the metrics for Cluster Disk Space In Use and Simulations Processed per Owner.
After you’ve saved the simulation files and output data and determined which simulations are safe to delete, you can use COMPS to delete them. Within COMPS, simulations are logically grouped into Experiments, which are logically grouped into Suites. When deleting simulations, start with the highest-level logical grouping, assuming you are fine with having all simulations within that grouping deleted. For example, deleting at the highest level of Suites also deletes all Experiments and Simulations contained within the deleted suite.
Note
It is possible to have simulations and experiments but not suites, depending on whether you have run multiple iterations of experiments. For example, when you run calibrations using calibtool from DTK-Tools, a suite is created along with experiments for each of the iterations run during calibration.
Use COMPS to save simulation files and output data¶
Follow the steps below to save simulation files and output data.
On the top left corner of COMPS, click the hamburger icon.
Click Explore, click Simulations, and then select your simulation.
Click FILES, select files to download, and then click Download Selected as ZIP.
Click OUTPUT, select files to download, and then click Download Selected as ZIP.
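Alternatively, output can be downloaded programmatically with pyCOMPS using Simulation.retrieve_output_files(), documented in the API reference below. This is a minimal sketch; the simulation id and output file paths are placeholders.
# Sketch: download selected output files for a simulation as a single zip.
# The simulation id and output paths are placeholders.
from COMPS import Client
from COMPS.Data import Simulation

Client.login('https://comps.idmod.org')

sim = Simulation.get('11111111-2222-3333-4444-555555555555')   # placeholder id
zip_bytes = sim.retrieve_output_files(
    paths=['output/InsetChart.json', 'stdout.txt'],             # placeholder paths
    as_zip=True
)

with open('simulation_output.zip', 'wb') as f:
    f.write(zip_bytes)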
Use dashboard view in COMPS to view resource consumption metrics¶
Follow the steps below to view resource consumption metrics.
On the top left corner of COMPS, click the hamburger icon.
Click Dashboard, scroll down to view Cluster Disk Space In Use and Simulations Processed per Owner.
Adjust the settings to view your usage metrics and time periods.
This can assist when filtering on time frames for simulations to delete.
Use COMPS to delete used resources from simulations¶
First determine the level of logical grouping (Suite, Experiment, Simulation) from which to delete your simulations. You should start with the highest-level logical grouping, assuming you are fine with having all simulations within that grouping deleted. The instructions below are for deleting a suite and everything it contains.
Follow the steps below to delete the simulations contained in a suite.
On the top left corner of COMPS, click the hamburger icon.
Click Explore, and then click Suites.
Filter for the suites to delete, for example by adding Owner and Created filters.
Select the suite to delete, and then click the delete trash icon. You will receive a warning notice to confirm your intent to delete.
Note
The actual deletion of simulations, after selecting and confirming what you want deleted in COMPS, can take anywhere from hours to weeks to occur. Independent “deletion” workers run in the background and defer to higher-priority workers, such as running simulations.
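The same cleanup can be scripted with pyCOMPS, combining QueryCriteria filters with the delete() method described under CommissionableEntity in the API reference below. This is only a sketch: the owner name and date value are placeholders, and the exact filter field names and date format are assumptions to confirm against your COMPS instance.
# Sketch: soft-delete old suites (and the experiments/simulations they contain).
# Filter field names, owner, and date values below are placeholders/assumptions.
from COMPS import Client
from COMPS.Data import Suite, QueryCriteria

Client.login('https://comps.idmod.org')

old_suites = Suite.get(query_criteria=QueryCriteria()
                       .select(['id', 'name', 'date_created'])
                       .where(['owner=jdoe', 'date_created<2020-01-01']))

for suite in old_suites:
    print('Deleting suite {} ({})'.format(suite.name, suite.id))
    suite.delete()   # soft-delete; actual removal happens later in the background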
Architecture¶
COMPS¶
For more detailed architecture information regarding COMPS and idmtools, see COMPS platform.
COMPS package¶
Subpackages¶
COMPS.Data package¶
Subpackages¶
Submodules¶
COMPS.Data.AssetCollection module¶
- class COMPS.Data.AssetCollection.AssetCollection[source]¶
Bases: TaggableEntity, RelatableEntity, SerializableEntity
Represents a collection of Assets.
Once saved, an AssetCollection is immutable, other than modifying tags. It contains various properties accessible by getters:
id
date_created
It also contains “child objects” (which must be specifically requested for retrieval using the QueryCriteria.select_children() method of QueryCriteria):
tags
assets
- property id¶
- property date_created¶
- property tags¶
- property assets¶
- classmethod get(id=None, query_criteria=None)[source]¶
Retrieve one or more AssetCollections.
- Parameters:
id – The id (str or UUID) of the AssetCollection to retrieve
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of AssetCollections returned, as well as which properties and child-objects to fill for the returned AssetCollections
- Returns:
An AssetCollection or list of AssetCollections (depending on whether ‘id’ was specified) with basic properties and child-objects assigned as specified by ‘query_criteria’
- refresh(query_criteria=None)[source]¶
Update properties of an existing AssetCollection from the server.
Since AssetCollections are mostly immutable, this is usually to retrieve/update fields or child-objects that weren’t retrieved initially (e.g. assets).
- Parameters:
query_criteria – A QueryCriteria object specifying which properties and child-objects to refresh on the AssetCollection
- save(return_missing_files=False, upload_files_callback=<function AssetCollection.<lambda>>)[source]¶
Save a single AssetCollection. An id is automatically assigned upon successful save.
When the AssetCollection contains a large number or large total size of new assets that need to be uploaded, this may be done in multiple “chunks”. This allows saving of arbitrarily-large AssetCollections while avoiding potential timeouts due to long processing time on the server.
- Parameters:
return_missing_files – A boolean that determines the behavior when the AssetCollection being saved contains an AssetCollectionFile to be saved by md5 checksum (i.e. without uploading the data) that is not yet in COMPS. If true, when there are such files, return an array of UUIDs representing the md5 checksums of the missing files. If false, raise an error when there are any such files.
upload_files_callback – Callback to call whenever a batch of assets completes uploading. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of assets saved so far will be passed when it is called.
- add_asset(assetcollectionfile, file_path=None, data=None, upload_callback=<function AssetCollection.<lambda>>)[source]¶
Add an AssetCollectionFile to an AssetCollection.
The contents of the file to add can be specified either by providing a path to the file or by providing the actual data as a byte-array. Alternately, if the file/data is already in COMPS, you can skip uploading it again and just provide an AssetCollectionFile that contains the md5 checksum of the data.
If the asset exceeds AssetManager.large_asset_upload_threshold bytes in size, the asset will be uploaded immediately, separately from the saving of the main AssetCollection. This allows saving of arbitrarily-large assets while avoiding potential timeouts or having to start from scratch in case the upload is interrupted by network issues.
NOTE: this can only be called for not-yet-saved AssetCollections, since AssetCollections are immutable once saved, other than modifying tags.
NOTE: providing both file/data and an md5 is considered invalid, as providing the md5 implies the caller knows the file/data is already in COMPS and doesn’t need to be uploaded again.
- Parameters:
assetcollectionfile – An AssetCollectionFile containing the metadata for the file to add.
file_path – The path to the file to add.
data – The actual bytes of data to add.
upload_callback – Callback to call whenever a large asset upload completes saving of a chunk of the asset. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of bytes saved so far will be passed when it is called.
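As a concrete illustration of the add_asset()/save() workflow described above, here is a minimal sketch that builds and tags a small AssetCollection from local files. The file names and tag values are placeholders.
# Sketch: build an AssetCollection from local input files, tag it, and save it.
# File names and tag values are placeholders.
from COMPS import Client
from COMPS.Data import AssetCollection, AssetCollectionFile

Client.login('https://comps.idmod.org')

ac = AssetCollection()
ac.set_tags({'Project': 'GenericExample', 'Purpose': 'demo'})   # applied on save

# Add each local file; small files are uploaded when the collection is saved.
for name in ['generic_scenarios_demographics.json', 'campaign.json']:
    acf = AssetCollectionFile(file_name=name)
    ac.add_asset(acf, file_path=name)

ac.save()
print('Saved asset collection with id {}'.format(ac.id))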
COMPS.Data.AssetCollectionFile module¶
- class COMPS.Data.AssetCollectionFile.AssetCollectionFile(file_name=None, relative_path=None, md5_checksum=None, tags=None)[source]¶
Bases: AssetFile, TaggableEntity, SerializableEntity
Represents a single Asset in an AssetCollection.
Once created, an AssetCollectionFile is immutable, other than modifying tags. It contains various properties accessible by getters:
file_name
relative_path
md5_checksum
length
uri
tags
The md5_checksum can be used as an id for the AssetCollectionFile.
- property relative_path¶
- property tags¶
COMPS.Data.AssetFile module¶
- class COMPS.Data.AssetFile.AssetFile(file_name, md5_checksum=None)[source]¶
Bases:
SerializableEntity
A base-type for all files associated with certain entity-types. This includes AssetCollectionFile (associated with an AssetCollection), SimulationFile (associated with a Simulation), and WorkItemFile (associated with a WorkItem).
This is used only for adding properties to these file-types, and shouldn’t be created directly (should probably be an ABC).
- property file_name¶
- property md5_checksum¶
- property length¶
- property uri¶
COMPS.Data.AssetManager module¶
- COMPS.Data.AssetManager.retrieve_output_files_from_info(entity_type, entity_id, metadata, job=None, as_zip=False)[source]¶
- COMPS.Data.AssetManager.retrieve_partial_output_file_from_info(metadata, startbyte, endbyte=None, actualrange=None)[source]¶
Retrieve part of an output file from a Simulation or WorkItem.
- Parameters:
metadata – An OutputFileMetadata object representing the output files to retrieve; this is likely obtained by calling the retrieve_output_file_info() method on Simulation or WorkItem.
startbyte – An integer representing the first byte in the request range, or if negative, the number of bytes at the end of the file to return (in which case, endbyte must be None).
endbyte – An integer representing the last byte in the request range. If this value is None and startbyte is positive, this represents the end of the file.
actualrange – An optional list argument which, if passed, will contain the start byte, end byte, and total file-size upon return. This is useful if requesting “the last N bytes in the file” or “from byte N to the end” in order to know the exact bytes which were returned.
- Returns:
A byte-array of the partial output file retrieved.
COMPS.Data.AssetType module¶
COMPS.Data.BaseEntity module¶
COMPS.Data.CommissionableEntity module¶
- class COMPS.Data.CommissionableEntity.CommissionableEntity[source]¶
Bases:
object
- commission()[source]¶
Commission an entity.
If called on a Suite/Experiment, this attempts to commission all contained Simulations currently in SimulationState.Created. If called on a Simulation, this attempts to commission that Simulation. Only applicable if it is currently in SimulationState.Created. If called on a WorkItem, this attempts to commission that WorkItem. Only applicable if it is currently in WorkItemState.Created.
- cancel()[source]¶
Cancel a running entity.
If called on a Suite/Experiment, this attempts to cancel all contained Simulations currently in an ‘active’ state:
SimulationState.CommissionRequested
SimulationState.Provisioning
SimulationState.Commissioned
SimulationState.Running
SimulationState.Retry
If called on a Simulation, this attempts to cancel that Simulation. Only applicable if it is currently in an ‘active’ state; see above. If called on a WorkItem, this attempts to cancel that WorkItem. Only applicable if it is currently in an ‘active’ state:
WorkItemState.CommissionRequested
WorkItemState.Commissioned
WorkItemState.Validating
WorkItemState.Running
WorkItemState.Waiting
WorkItemState.ResumeRequested
WorkItemState.Resumed
- delete(expire_now=False)[source]¶
“Soft-delete” this entity.
This entity record and all associated files, etc, will be marked for deletion in COMPS. They will remain for some period of time before being permanently deleted, but will no longer be returned by the COMPS service or visible in the UI.
If called on a Suite/Experiment, this delete also applies to all contained Experiments/Simulations.
- Parameters:
expire_now – If this is set to True, this entity will be eligible for permanent deletion immediately (though depending on deletion activity in the system, it may still be a while before it’s fully deleted).
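As a short illustration of the commission/cancel/delete methods above, the following sketch cancels and then soft-deletes an experiment; the experiment id is a placeholder.
# Sketch: cancel all active simulations in an experiment, then soft-delete it.
# The experiment id is a placeholder.
from COMPS import Client
from COMPS.Data import Experiment

Client.login('https://comps.idmod.org')

exp = Experiment.get('11111111-2222-3333-4444-555555555555')   # placeholder id
exp.cancel()    # cancels contained simulations currently in an 'active' state
exp.delete()    # marks the experiment and its simulations for deletion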
COMPS.Data.Configuration module¶
- class COMPS.Data.Configuration.Configuration(environment_name=None, simulation_input_args=None, working_directory_root=None, executable_path=None, node_group_name=None, maximum_number_of_retries=None, priority=None, min_cores=None, max_cores=None, exclusive=None, asset_collection_id=None)[source]¶
Bases:
SerializableEntity
Configuration properties associated with a Suite, Experiment, or Simulation.
A Configuration object is an immutable object containing various properties accessible by getters:
environment_name
simulation_input_args
working_directory_root
executable_path
node_group_name
maximum_number_of_retries
priority
min_cores
max_cores
exclusive
asset_collection_id
Properties of a Configuration associated with a Simulation will override properties of a Configuration associated with an Experiment, either of which will override properties of a Configuration associated with a Suite.
No properties are required at any given level in the Suite/Experiment/Simulation hierarchy, but in order to create and run a simulation, at least the environment_name and executable_path must be specified somewhere in the hierarchy.
- property environment_name¶
- property simulation_input_args¶
- property working_directory_root¶
- property executable_path¶
- property node_group_name¶
- property maximum_number_of_retries¶
- property priority¶
- property min_cores¶
- property max_cores¶
- property exclusive¶
- property asset_collection_id¶
COMPS.Data.Experiment module¶
- class COMPS.Data.Experiment.Experiment(name, suite_id=None, description=None, configuration=None)[source]¶
Bases: TaggableEntity, CommissionableEntity, RelatableEntity, SerializableEntity
Represents a grouping of Simulations.
Contains various basic properties accessible by getters (and, in some cases, +setters):
id
+suite_id
+name
+description
owner
date_created
last_modified
Also contains “child objects” (which must be specifically requested for retrieval using the QueryCriteria.select_children() method of QueryCriteria):
tags
configuration
- property id¶
- property suite_id¶
- property name¶
- property description¶
- property owner¶
- property date_created¶
- property last_modified¶
- property tags¶
- property configuration¶
- classmethod get(id=None, query_criteria=None)[source]¶
Retrieve one or more Experiments.
- Parameters:
id – The id (str or UUID) of the Experiment to retrieve
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of Experiments returned, as well as which properties and child-objects to fill for the returned Experiments
- Returns:
An Experiment or list of Experiments (depending on whether ‘id’ was specified) with basic properties and child-objects assigned as specified by ‘query_criteria’
- refresh(query_criteria=None)[source]¶
Update properties of an existing Experiment from the server.
- Parameters:
query_criteria – A QueryCriteria object specifying which properties and child-objects to refresh on the Experiment
- get_simulations(query_criteria=None)[source]¶
Retrieve Simulations contained in this Experiment.
- Parameters:
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of Simulations returned, as well as which properties and child-objects to fill for the returned Simulations
- Returns:
A list of Simulations with basic properties and child-objects assigned as specified by ‘query_criteria’
COMPS.Data.HpcJob module¶
- class COMPS.Data.HpcJob.HpcJob[source]¶
Bases:
SerializableEntity
Represents a single HPC Job.
Contains various properties accessible by getters:
job_id
job_state
priority
working_directory
output_directory_size
submit_time
start_time
end_time
error_message
configuration
HpcJobs are created by the COMPS Job Service, so they’re read-only, used for tracking HPC Jobs.
Note: Tasks are not currently used in the COMPS system, so task properties are only there for future use.
- property job_id¶
- property job_state¶
- property priority¶
- property working_directory¶
- property output_directory_size¶
- property submit_time¶
- property start_time¶
- property end_time¶
- property error_message¶
- property configuration¶
- class COMPS.Data.HpcJob.HpcState(value)[source]¶
Bases:
Enum
An enumeration representing the state of the job, as tracked by the HPC cluster.
- NotSet = 0¶
- Configuring = 1¶
- Submitted = 2¶
- Validating = 4¶
- ExternalValidation = 8¶
- Queued = 16¶
- Running = 32¶
- Finishing = 64¶
- Finished = 128¶
- Failed = 256¶
- Canceled = 512¶
- Canceling = 1024¶
COMPS.Data.OutputFileMetadata module¶
COMPS.Data.Priority module¶
COMPS.Data.QueryCriteria module¶
- class COMPS.Data.QueryCriteria.QueryCriteria[source]¶
Bases:
object
A helper class to control query return-sets by filtering on basic properties and tags, as well as controlling which properties and child-objects to fill for returned objects.
- property fields¶
- property children¶
- property filters¶
- property tag_filters¶
- property xparams¶
- select(fields)[source]¶
Set which basic properties to fill for returned objects.
- Parameters:
fields – A list of basic properties to fill; e.g. [‘id’,’description’].
- Returns:
A reference to this object so calls can be chained.
- select_children(children)[source]¶
Set which child objects to fill for returned objects.
- Parameters:
children – A list of child objects to fill; e.g. [‘tags’,’hpc_jobs’].
- Returns:
A reference to this object so calls can be chained.
- where(filters)[source]¶
Set filter criteria for basic properties.
For string filter values, ‘~’ is used for the “like”-operator (i.e. string-contains). For numeric filter values, standard arithmetic operators are allowed.
- Parameters:
filters – A list of basic property filter-criteria; e.g. [‘name~Test’,’state=Failed’].
- Returns:
A reference to this object so calls can be chained.
- where_tag(tag_filters)[source]¶
Set filter criteria for tags.
For string filter values, ‘~’ is used for the “like”-operator (i.e. string-contains). For numeric filter values, standard arithmetic operators are allowed.
- Parameters:
tag_filters – A list of tag filter-criteria; e.g. [‘Replicate=3’,’DiseaseType~Malaria’].
- Returns:
A reference to this object so calls can be chained.
- orderby(orderby_field)[source]¶
Set which basic property to sort returned results-set by.
- Parameters:
orderby_field – A string containing the basic property name to sort by. By default, ascending-sort is assumed, but descending-sort can be specified by appending a space and ‘desc’ to this argument; e.g. ‘date_created desc’.
- Returns:
A reference to this object so calls can be chained.
- offset(offset_num)[source]¶
Set the offset within the results-set to start returning results from.
- Parameters:
offset_num – An int to specify offset within the results-set.
- Returns:
A reference to this object so calls can be chained.
- count(count_num)[source]¶
Set the maximum number of results to return in the results-set.
- Parameters:
count_num – An int to specify maximum number of results to return.
- Returns:
A reference to this object so calls can be chained.
- add_extra_params(xp_dict)[source]¶
Set any parameters that aren’t otherwise explicitly supported. This allows taking advantage of future changes to COMPS even if pyCOMPS support is not yet implemented or you are using an older version of pyCOMPS.
- Parameters:
xp_dict – A dictionary of additional parameters and values to pass to the COMPS API.
- Returns:
A reference to this object so calls can be chained.
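Putting the chainable methods above together, a typical query might look like the following sketch. The filter and tag values reuse the examples from the docstrings above and are placeholders.
# Sketch: query experiments whose name contains 'Test' and that carry a
# malaria-like DiseaseType tag, newest first. Filter values are placeholders.
from COMPS import Client
from COMPS.Data import Experiment, QueryCriteria

Client.login('https://comps.idmod.org')

qc = (QueryCriteria()
      .select(['id', 'name', 'date_created'])
      .select_children(['tags'])
      .where(['name~Test'])                  # 'name' contains 'Test'
      .where_tag(['DiseaseType~Malaria'])    # tag "like"-match
      .orderby('date_created desc')
      .count(20))

for exp in Experiment.get(query_criteria=qc):
    print(exp.id, exp.name, exp.tags)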
COMPS.Data.RelatableEntity module¶
- class COMPS.Data.RelatableEntity.RelatableEntity[source]¶
Bases:
object
Get all ‘parent’ workitems that this entity is related to.
- Parameters:
relation_type – A RelationType object specifying which parent related WorkItems to filter to. If none is specified, all parent related WorkItems are returned.
COMPS.Data.SerializableEntity module¶
COMPS.Data.Simulation module¶
- class COMPS.Data.Simulation.Simulation(name, experiment_id=None, description=None, configuration=None)[source]¶
Bases: TaggableEntity, CommissionableEntity, RelatableEntity, SerializableEntity
Represents a single simulation run.
Contains various basic properties accessible by getters (and, in some cases, +setters):
id
+experiment_id
+name
+description
owner
date_created
last_modified
state
error_message
Also contains “child objects” (which must be specifically requested for retrieval using the QueryCriteria.select_children() method of QueryCriteria):
tags
configuration
files
hpc_jobs
- property id¶
- property experiment_id¶
- property name¶
- property description¶
- property owner¶
- property date_created¶
- property last_modified¶
- property state¶
- property error_message¶
- property tags¶
- property configuration¶
- property files¶
- property hpc_jobs¶
- classmethod get(id=None, query_criteria=None)[source]¶
Retrieve one or more Simulations.
- Parameters:
id – The id (str or UUID) of the Simulation to retrieve
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of Simulations returned, as well as which properties and child-objects to fill for the returned Simulations
- Returns:
A Simulation or list of Simulations (depending on whether ‘id’ was specified) with basic properties and child-objects assigned as specified by ‘query_criteria’
- refresh(query_criteria=None)[source]¶
Update properties of an existing Simulation from the server.
- Parameters:
query_criteria – A QueryCriteria object specifying which properties and child-objects to refresh on the Simulation
- save(return_missing_files=False, save_semaphore=None)[source]¶
Save a single Simulation. If it’s a new Simulation, an id is automatically assigned.
- Parameters:
return_missing_files – A boolean that determines the behavior when the Simulation being saved contains a SimulationFile to be saved by md5 checksum (i.e. without uploading the data) that is not yet in COMPS. If true, when there are such files, return an array of UUIDs representing the md5 checksums of the missing files. If false, raise an error when there are any such files.
- static save_all(save_batch_callback=<function Simulation.<lambda>>, return_missing_files=False, save_semaphore=None)[source]¶
Batch-save all unsaved Simulations.
Simulations are saved in batches of at most ‘__max_sim_batch_count’ and with a maximum request size of ‘__max_sim_batch_request_size_kb’.
- Parameters:
save_batch_callback – Callback to call whenever a request to save a batch of Simulations completes. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of Simulations saved so far will be passed when it is called.
return_missing_files – A boolean that determines the behavior when any of the Simulations being saved contains a SimulationFile to be saved by md5 checksum (i.e. without uploading the data) that is not yet in COMPS. If true, when there are such files, return an array of UUIDs representing the md5 checksums of the missing files. If false, raise an error when there are any such files.
- add_file(simulationfile, file_path=None, data=None, upload_callback=<function Simulation.<lambda>>)[source]¶
Add a SimulationFile to a Simulation.
The contents of the file to add can be specified either by providing a path to the file or by providing the actual data as a byte-array. Alternately, if the file/data is already in COMPS, you can skip uploading it again and just provide a SimulationFile that contains the md5 checksum of the data.
If the file exceeds AssetManager.large_asset_upload_threshold bytes in size, the file will be uploaded immediately, separately from the saving of the main Simulation. This allows saving of arbitrarily-large files while avoiding potential timeouts or having to start from scratch in case the upload is interrupted by network issues.
NOTE: providing both file/data and an md5 is considered invalid, as providing the md5 implies the caller knows the file/data is already in COMPS and doesn’t need to be uploaded again.
- Parameters:
simulationfile – A SimulationFile containing the metadata for the file to add.
file_path – The path to the file to add.
data – The actual bytes of data to add.
upload_callback – Callback to call whenever a large file upload completes saving of a chunk of the file. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of bytes saved so far will be passed when it is called.
- retrieve_output_files(paths, job=None, as_zip=False)[source]¶
Retrieve output files associated with this Simulation.
This essentially combines the functionality of retrieve_output_file_info() and retrieve_output_files_from_info(), and can be used if the user doesn’t care about specific metadata related to the files being retrieved.
- Parameters:
paths – Partial paths (relative to the working directory) of the output files to retrieve. If ‘as_zip’ is true, this can be None/empty or not specified, and all output files will be included in the zip returned.
job – The HpcJob associated with the given Simulation to retrieve assets for. If not specified, will default to the last HpcJob chronologically.
as_zip – A boolean controlling whether the output files are returned individually or as a single zip-file (useful for attaching to an e-mail, etc).
- Returns:
If ‘as_zip’ is true, returns a single byte-array of a zip-file; otherwise, returns a list of byte-arrays of the output files retrieved, in the same order as the ‘paths’ parameter.
- retrieve_output_file_info(paths, job=None)[source]¶
Retrieve OutputFileMetadata about output files associated with this Simulation.
- Parameters:
paths – Partial paths (relative to the working directory) of the output files to retrieve. If None/empty or not specified, will default to return all output files.
job – The HpcJob associated with the given Simulation to retrieve output files for. If not specified, will default to the last HpcJob chronologically.
- Returns:
A list of OutputFileMetadata objects for the output files to retrieve, in the same order as the ‘paths’ parameter.
- retrieve_output_files_from_info(metadata, job=None, as_zip=False)[source]¶
Actually retrieve the output files associated with this Simulation.
- Parameters:
metadata – A list of OutputFileMetadata objects representing the output files to retrieve associated with this Simulation.
job – The HpcJob associated with the given Simulation to retrieve output files for. This should match the ‘job’ provided to the retrieve_output_file_info() call. If not specified, will default to the last HpcJob chronologically.
as_zip – A boolean controlling whether the output files are returned individually or as a single zip-file (useful for attaching to an e-mail, etc).
- Returns:
If ‘as_zip’ is true, returns a single byte-array of a zip-file; otherwise, returns a list of byte-arrays of the output files retrieved, in the same order as the ‘paths’ parameter.
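As an illustration of the batch workflow described above (add_file() together with the static save_all()), here is a minimal sketch that creates a small sweep of simulations under an existing experiment. The experiment id and generated config contents are placeholders, and the 'input' file type is assumed.
# Sketch: create a batch of simulations under an existing experiment and
# save them all at once with Simulation.save_all(). Placeholder values marked.
import json
from COMPS import Client
from COMPS.Data import Simulation, SimulationFile

Client.login('https://comps.idmod.org')

experiment_id = '11111111-2222-3333-4444-555555555555'    # placeholder id

for run_number in range(10):
    sim = Simulation(name='sweep_run_{}'.format(run_number),
                     experiment_id=experiment_id)
    sim.set_tags({'Run_Number': str(run_number)})          # applied on save
    config = {'parameters': {'Run_Number': run_number}}    # placeholder config
    sim.add_file(SimulationFile('config.json', 'input'),
                 data=json.dumps(config).encode('utf-8'))

Simulation.save_all()   # batch-save all unsaved simulations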
COMPS.Data.SimulationFile module¶
- class COMPS.Data.SimulationFile.SimulationFile(file_name, file_type, description='', md5_checksum=None)[source]¶
Bases: AssetFile, SerializableEntity
Represents metadata for a Simulation file.
Contains various basic properties accessible by getters:
file_name
file_type
description
md5_checksum
length
uri
‘file_name’ and ‘file_type’ must be set on creation; ‘description’ is optional.
- property file_type¶
- property description¶
COMPS.Data.Suite module¶
- class COMPS.Data.Suite.Suite(name, description=None, configuration=None)[source]¶
Bases: TaggableEntity, CommissionableEntity, RelatableEntity, SerializableEntity
Represents a grouping of Experiments.
Contains various basic properties accessible by getters (and, in some cases, +setters):
id
+name
+description
owner
date_created
last_modified
Also contains “child objects” (which must be specifically requested for retrieval using the QueryCriteria.select_children() method of QueryCriteria):
tags
configuration
- property id¶
- property name¶
- property description¶
- property owner¶
- property date_created¶
- property last_modified¶
- property tags¶
- property configuration¶
- classmethod get(id=None, query_criteria=None)[source]¶
Retrieve one or more Suites.
- Parameters:
id – The id (str or UUID) of the Suite to retrieve
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of Suites returned, as well as which properties and child-objects to fill for the returned Suites
- Returns:
A Suite or list of Suites (depending on whether ‘id’ was specified) with basic properties and child-objects assigned as specified by ‘query_criteria’
- refresh(query_criteria=None)[source]¶
Update properties of an existing Suite from the server.
- Parameters:
query_criteria – A QueryCriteria object specifying which properties and child-objects to refresh on the Suite
- get_experiments(query_criteria=None)[source]¶
Retrieve Experiments contained in this Suite.
- Parameters:
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of Experiments returned, as well as which properties and child-objects to fill for the returned Experiments
- Returns:
A list of Experiments with basic properties and child-objects assigned as specified by ‘query_criteria’
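The getters above can be chained to walk the Suite/Experiment/Simulation hierarchy; the following sketch lists the simulations in each experiment of a suite. The suite id is a placeholder.
# Sketch: list the experiments in a suite and the state of their simulations.
# The suite id is a placeholder.
from COMPS import Client
from COMPS.Data import Suite, QueryCriteria

Client.login('https://comps.idmod.org')

suite = Suite.get('11111111-2222-3333-4444-555555555555')   # placeholder id

for exp in suite.get_experiments():
    sims = exp.get_simulations(QueryCriteria().select(['id', 'state']))
    print('{}: {} simulations'.format(exp.name, len(sims)))
    for sim in sims:
        print('  {} -> {}'.format(sim.id, sim.state))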
COMPS.Data.TaggableEntity module¶
- class COMPS.Data.TaggableEntity.TaggableEntity[source]¶
Bases:
object
- set_tags(tags)[source]¶
Set the tag key/value pairs associated with this entity.
If the entity has any existing tags, they will be replaced by the tags specified. If this is a new entity, tags will not be updated until the entity is saved, otherwise tags are updated immediately.
- Parameters:
tags – A dictionary containing the key/value tag-string pairs to set.
- merge_tags(tags)[source]¶
Merge the given tag key/value pairs with existing tags for this entity.
Any tag keys that already have an existing tag with that key specified for the entity will have their values replaced by the value specified. Any tag keys that don’t already exist for the entity will be added with their specified value.
- Parameters:
tags – A dictionary containing the key/value tag-string pairs to merge.
COMPS.Data.WorkItem module¶
- class COMPS.Data.WorkItem.WorkItem(name, worker, environment_name, description=None, asset_collection_id=None, priority=None)[source]¶
Bases: TaggableEntity, CommissionableEntity, RelatableEntity, SerializableEntity
Represents a single work-item.
Contains various basic properties accessible by getters (and, in some cases, +setters):
id
+name
+description
owner
date_created
last_modified
state
error_message
worker
environment_name
host_name
worker_instance_id
priority
working_directory
working_directory_size
asset_collection_id
Also contains “child objects” (which must be specifically requested for retrieval using the QueryCriteria.select_children() method of QueryCriteria):
tags
files
plugins
- property id¶
- property name¶
- property worker¶
- property environment_name¶
- property description¶
- property owner¶
- property date_created¶
- property last_modified¶
- property state¶
- property error_message¶
- property host_name¶
- property worker_instance_id¶
- property priority¶
- property working_directory¶
- property working_directory_size¶
- property asset_collection_id¶
- property tags¶
- property files¶
- property plugins¶
- classmethod get(id=None, query_criteria=None)[source]¶
Retrieve one or more WorkItems.
- Parameters:
id – The id (str or UUID) of the WorkItem to retrieve
query_criteria – A QueryCriteria object specifying basic property filters and tag-filters to apply to the set of WorkItems returned, as well as which properties and child-objects to fill for the returned WorkItems
- Returns:
A WorkItem or list of WorkItems (depending on whether ‘id’ was specified) with basic properties and child-objects assigned as specified by ‘query_criteria’
- refresh(query_criteria=None)[source]¶
Update properties of an existing WorkItem from the server.
- Parameters:
query_criteria – A QueryCriteria object specifying which properties and child-objects to refresh on the WorkItem
Get a list of WorkItems related to this WorkItem
- Parameters:
relation_type – A RelationType object specifying which related WorkItems to filter to. If none is specified, all related WorkItems are returned.
Get a list of Suites related to this WorkItem
- Parameters:
relation_type – A RelationType object specifying which related Suites to filter to. If none is specified, all related Suites are returned.
Get a list of Experiments related to this WorkItem
- Parameters:
relation_type – A RelationType object specifying which related Experiments to filter to. If none is specified, all related Experiments are returned.
Get a list of Simulations related to this WorkItem
- Parameters:
relation_type – A RelationType object specifying which related Simulations to filter to. If none is specified, all related Simulations are returned.
Get a list of AssetCollections related to this WorkItem
- Parameters:
relation_type – A RelationType object specifying which related AssetCollections to filter to. If none is specified, all related AssetCollections are returned.
Add a relationship between this WorkItem and a related WorkItem
- Parameters:
related_id – The id (str or UUID) of the related WorkItem
relation_type – The RelationType that describes how this WorkItem is related to the related WorkItem
Add a relationship between this WorkItem and a related Suite
- Parameters:
related_id – The id (str or UUID) of the related Suite
relation_type – The RelationType that describes how this WorkItem is related to the related Suite
Add a relationship between this WorkItem and a related Experiment
- Parameters:
related_id – The id (str or UUID) of the related Experiment
relation_type – The RelationType that describes how this WorkItem is related to the related Experiment
Add a relationship between this WorkItem and a related Simulation
- Parameters:
related_id – The id (str or UUID) of the related Simulation
relation_type – The RelationType that describes how this WorkItem is related to the related Simulation
Add a relationship between this WorkItem and a related AssetCollection
- Parameters:
related_id – The id (str or UUID) of the related AssetCollection
relation_type – The RelationType that describes how this WorkItem is related to the related AssetCollection
- save(return_missing_files=False, save_semaphore=None)[source]¶
Save a single WorkItem. If it’s a new WorkItem, an id is automatically assigned.
- Parameters:
return_missing_files – A boolean that determines the behavior when the WorkItem being saved contains a WorkItemFile to be saved by md5 checksum (i.e. without uploading the data) that is not yet in COMPS. If true, when there are such files, return an array of UUIDs representing the md5 checksums of the missing files. If false, raise an error when there are any such files.
- static save_all(save_batch_callback=<function WorkItem.<lambda>>, return_missing_files=False, save_semaphore=None)[source]¶
Batch-save all unsaved WorkItems.
WorkItems are saved in batches of at most ‘__max_wi_batch_count’ and with a maximum request size of ‘__max_wi_batch_request_size_kb’.
- Parameters:
save_batch_callback – Callback to call whenever a request to save a batch of WorkItems completes. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of WorkItems saved so far will be passed when it is called.
return_missing_files – A boolean that determines the behavior when any of the WorkItems being saved contains a WorkItemFile to be saved by md5 checksum (i.e. without uploading the data) that is not yet in COMPS. If true, when there are such files, return an array of UUIDs representing the md5 checksums of the missing files. If false, raise an error when there are any such files.
- add_work_order(file_path=None, data=None)[source]¶
Add the WorkOrder for a WorkItem.
The contents of the WorkOrder file to add can be specified either by providing a path to the file or by providing the actual data as a string.
- Parameters:
file_path – The path to the work-order file to add.
data – The actual bytes of work-order data to add.
- add_file(workitemfile, file_path=None, data=None, upload_callback=<function WorkItem.<lambda>>)[source]¶
Add a WorkItemFile to a WorkItem.
The contents of the file to add can be specified either by providing a path to the file or by providing the actual data as a byte-array. Alternately, if the file/data is already in COMPS, you can skip uploading it again and just provide a WorkItemFile that contains the md5 checksum of the data.
If the file exceeds AssetManager.large_asset_upload_threshold bytes in size, the file will be uploaded immediately, separately from the saving of the main WorkItem. This allows saving of arbitrarily-large files while avoiding potential timeouts or having to start from scratch in case the upload is interrupted by network issues.
NOTE: providing both file/data and an md5 is considered invalid, as providing the md5 implies the caller knows the file/data is already in COMPS and doesn’t need to be uploaded again.
- Parameters:
workitemfile – A WorkItemFile containing the metadata for the file to add.
file_path – The path to the file to add.
data – The actual bytes of data to add.
upload_callback – Callback to call whenever a large file upload completes saving of a chunk of the file. Default behavior is to print a single ‘.’ to the console. If the callback supplied takes 1 argument, the number of bytes saved so far will be passed when it is called.
- retrieve_output_files(paths, as_zip=False)[source]¶
Retrieve output files associated with this WorkItem.
This essentially combines the functionality of retrieve_output_file_info() and retrieve_output_files_from_info(), and can be used if the user doesn’t care about specific metadata related to the files being retrieved.
- Parameters:
paths – Partial paths (relative to the working directory) of the output files to retrieve. If ‘as_zip’ is true, this can be None/empty or not specified, and all output files will be included in the zip returned.
as_zip – A boolean controlling whether the output files are returned individually or as a single zip-file (useful for attaching to an e-mail, etc).
- Returns:
If ‘as_zip’ is true, returns a single byte-array of a zip-file; otherwise, returns a list of byte-arrays of the output files retrieved, in the same order as the ‘paths’ parameter.
- retrieve_output_file_info(paths)[source]¶
Retrieve OutputFileMetadata about output files associated with this WorkItem.
- Parameters:
paths – Partial paths (relative to the working directory) of the output files to retrieve. If None/empty or not specified, will default to return all output files.
- Returns:
A list of OutputFileMetadata objects for the output files to retrieve, in the same order as the ‘paths’ parameter.
- retrieve_output_files_from_info(metadata, as_zip=False)[source]¶
Actually retrieve the output files associated with this WorkItem.
- Parameters:
metadata – A list of OutputFileMetadata objects representing the output files to retrieve associated with this WorkItem.
as_zip – A boolean controlling whether the output files are returned individually or as a single zip-file (useful for attaching to an e-mail, etc).
- Returns:
If ‘as_zip’ is true, returns a single byte-array of a zip-file; otherwise, returns a list of byte-arrays of the output files retrieved, in the same order as the ‘paths’ parameter.
- class COMPS.Data.WorkItem.WorkerOrPluginKey(name, version)¶
Bases:
tuple
- name¶
Alias for field number 0
- version¶
Alias for field number 1
- class COMPS.Data.WorkItem.WorkItemState(value)[source]¶
Bases:
Enum
An enumeration representing the current state of a WorkItem
- Created = 0¶
- CommissionRequested = 5¶
- Commissioned = 10¶
- Validating = 30¶
- Running = 40¶
- Waiting = 50¶
- ResumeRequested = 60¶
- CancelRequested = 80¶
- Canceled = 90¶
- Resumed = 100¶
- Canceling = 120¶
- Succeeded = 130¶
- Failed = 140¶
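Tying this module together, the following sketch creates and commissions a WorkItem using a WorkerOrPluginKey and a JSON work order. The worker name/version, environment name, and work-order contents are placeholders; they depend entirely on the workers deployed in your COMPS environment.
# Sketch: create and commission a WorkItem. Worker name/version, environment
# name, and work-order contents are placeholders for environment-specific values.
import json
from COMPS import Client
from COMPS.Data.WorkItem import WorkItem, WorkerOrPluginKey

Client.login('https://comps.idmod.org')

worker = WorkerOrPluginKey(name='ExampleWorker', version='1.0.0.0')   # placeholder
wi = WorkItem(name='example work item',
              worker=worker,
              environment_name='ExampleEnv')                          # placeholder

work_order = {'WorkItem_Type': 'ExampleWorker'}                       # placeholder contents
wi.add_work_order(data=json.dumps(work_order))
wi.save()
wi.commission()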
COMPS.Data.WorkItemFile module¶
- class COMPS.Data.WorkItemFile.WorkItemFile(file_name, file_type, description='', md5_checksum=None)[source]¶
Bases: AssetFile, SerializableEntity
Represents metadata for a WorkItem file.
Contains various basic properties accessible by getters:
file_name
file_type
description
md5_checksum
length
uri
‘file_name’ and ‘file_type’ must be set on creation; ‘description’ is optional.
- property file_type¶
- property description¶
COMPS.utils package¶
Submodules¶
COMPS.utils.clone_simulation module¶
COMPS.utils.create_asset_collection module¶
COMPS.utils.get_output_files_for_experiment module¶
COMPS.utils.get_output_files_for_workitem module¶
COMPS.utils.get_output_tail module¶
COMPS.utils.get_status module¶
COMPS.utils.main module¶
- class COMPS.utils.main.CustomFormatter(prog, indent_increment=2, max_help_position=24, width=None)[source]¶
Bases:
RawDescriptionHelpFormatter
- class COMPS.utils.main.CustomCommandFormatter(prog, indent_increment=2, max_help_position=24, width=None)[source]¶
Bases:
RawDescriptionHelpFormatter
COMPS.utils.rerun_failed_simulations_for_experiment module¶
Submodules¶
COMPS.AuthManager module¶
- class COMPS.AuthManager.AuthManager(hoststring, verify_certs=False, credential_prompt=None)[source]¶
Bases:
object
Manage authentication to COMPS.
- property username¶
- property hoststring¶
- property groups¶
- property environments¶
- static get_environment_macros(environment_name)[source]¶
Retrieve the environment macros for a COMPS environment.
This may be a somewhat temporary requirement until the Asset Service handles file dependencies more completely (allows uploads, etc).
- Parameters:
environment_name – the COMPS environment to retrieve macros for
- Returns:
a dictionary of environment macro key/value pairs
COMPS.Client module¶
- class COMPS.Client.Client[source]¶
Bases:
object
Client object for managing access to COMPS
- classmethod auth_manager()[source]¶
Retrieve the AuthManager.
You must be logged in first; otherwise, this raises a RuntimeError.
- Returns:
the AuthManager instance
- classmethod login(hoststring, credential_prompt=None)[source]¶
Log in to the COMPS service.
The specified COMPS hoststring allows a couple points of flexibility:
Secure vs. insecure - Specifying the protocol as http or https allows the user to control whether the SSL transport is used for requests. By default, https is used.
Port - Specifying a particular port allows the user to control the port to communicate over for requests. By default, the standard port for the chosen protocol is used (i.e. 80 for http, 443 for https).
For example, the following are all valid formats:
comps.idmod.org - uses secure https protocol over port 443.
http://internal.idmod.org - uses unsecure http protocol over port 80.
localhost:54321 - uses secure https protocol over port 54321.
Calling login() when already logged into a different host is invalid and will raise a RuntimeError. When already logged into the same host, nothing is done and the function returns immediately.
- Parameters:
hoststring – the COMPS host to connect to
credential_prompt – a CredentialPrompt object that controls how the user will supply their login credentials. By default, pyCOMPS will try to open a graphical prompt (TKCredentialPrompt) and fall back to console (ConsoleCredentialPrompt) if that fails.
- classmethod logout(hoststring=None)[source]¶
Log out of the COMPS service.
If logged in, this clears any cached credentials and nulls the AuthManager instance. Otherwise, you may pass a hoststring parameter to clear cached credentials for a particular COMPS host.
- Parameters:
hoststring – the COMPS host to clear credentials for
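A minimal login/logout sketch following the host-string formats described above; it explicitly requests a console credential prompt instead of the default TK-based prompt. The host string is one of the documented examples.
# Sketch: log in to COMPS with a console credential prompt, then log out.
from COMPS import Client
from COMPS.CredentialPrompt import ConsoleCredentialPrompt

Client.login('https://comps.idmod.org',
             credential_prompt=ConsoleCredentialPrompt())

# ... interact with COMPS here ...

Client.logout()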
- classmethod post(path, include_comps_auth_token=True, http_err_handle_exceptions=None, **kwargs)[source]¶
- classmethod put(path, include_comps_auth_token=True, http_err_handle_exceptions=None, **kwargs)[source]¶
- classmethod get(path, include_comps_auth_token=True, http_err_handle_exceptions=None, **kwargs)[source]¶
- classmethod delete(path, include_comps_auth_token=True, http_err_handle_exceptions=None, **kwargs)[source]¶
COMPS.CredentialPrompt module¶
- class COMPS.CredentialPrompt.CredentialPrompt[source]¶
Bases:
object
Abstract definition for our credential prompts.
- class COMPS.CredentialPrompt.ConsoleCredentialPrompt[source]¶
Bases:
CredentialPrompt
A simple console based credential prompt
- class COMPS.CredentialPrompt.TKCredentialPrompt[source]¶
Bases:
CredentialPrompt
A TK based credential prompt
Glossary¶
The following terms are used to describe processes, concepts, and the files, features, and functionality related to using COMPS.
- asset collection¶
Collection of user-created input files, such as demographics, temperature, weather, and overlay files. These files are stored in COMPS and can be made available for use by other users.
- dashboard¶
The COMPS dashboard provides an overview of computing cluster usage, including current and queued jobs. Resource management is simple due to the job-priority system used by the platform.
- experiments¶
Logical grouping of simulations. This allows for managing numerous simulations as a single unit or grouping.
- multi-chart¶
COMPS provides powerful charting functionality to visualize the output channels for simulations. A chart can include output for a single simulation or for multiple simulations. Viewing multiple simulations in a single chart (multi-chart) provides a fast, flexible way to filter simulations to view only data of interest.
- suites¶
Logical grouping of experiments. This allows for managing multiple experiments as a single unit or grouping.
- work item¶
A work item is used to build experiments and suites. It builds a set of simulations or groups of simulations, such as parameter sweeps. A work item also defines how many simulations run at the start of the experiment to determine whether the configuration settings are functional.
- work order¶
A JSON-formatted file used to create a work item, in combination with a configuration file and, optionally, campaign and additional files.