hpc05.client module

class hpc05.client.Client(profile='pbs', hostname='hpc05', username=None, password=None, culler=True, culler_args=None, env_path=None, local=False, *args, **kwargs)[source]

Bases: ipyparallel.client.client.Client

Return an ipyparallel.Client, connect to a remote ipcluster over ssh if local=False, and start the engine culler.

Parameters
  • profile (str) – Profile name; the default ‘pbs’ results in the folder ipython/profile_pbs.

  • hostname (str) – Hostname of the machine where the ipcluster runs. If connecting via the headnode, use socket.gethostname() or set local=True.

  • username (str) – Username to log into hostname. If not provided, it is looked up in your ssh config.

  • password (str) – Password for ssh access as username@hostname.

  • culler (bool) – Controls starting of the culler. Default: True.

  • culler_args (str) – Extra arguments passed to the culler, e.g. ‘--timeout=200’.

  • env_path (str, default: None) – Path of the Python environment, ‘/path/to/ENV/’ if Python is in /path/to/ENV/bin/python. Examples ‘~/miniconda3/envs/dev/’, ‘miniconda3/envs/dev’, ‘~/miniconda3’. Defaults to the environment that is sourced in bashrc or bash_profile.

  • local (bool, default: False) – Whether to connect to the ipcluster locally instead of over ssh. Keep the default False if a connection over ssh is needed; set it to True when running on the headnode itself (see the usage sketch below).
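
For illustration, a minimal usage sketch: the username and culler timeout below are placeholders rather than defaults, and it assumes an ipcluster with the ‘pbs’ profile is already running on the cluster (see Notes).

# A minimal sketch, assuming an ipcluster with the 'pbs' profile is already
# running on the cluster; the username and culler timeout are placeholders.
import hpc05

# Connect over ssh from your own machine (local=False is the default).
client = hpc05.Client(
    profile='pbs',
    hostname='hpc05',
    username='myuser',            # placeholder; omit to use your ssh config
    culler_args='--timeout=200',
)

# When running on the headnode itself, connect locally instead:
# client = hpc05.Client(profile='pbs', local=True)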

json_filename

File name of the temporary local json file with connection details.

Type: str

tunnel

SSH tunnel used to connect to the hpc05 cluster.

Type: pexpect.spawn object

Notes

You need a profile with PBS (or SLURM) settings in your ipython folder on the cluster. You can generate this by running:

hpc05.create_remote_pbs_profile(username, hostname)

Then set up an ipcluster on the hpc05 by starting a screen session and running

ipcluster start --n=10 --profile=pbs
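
Putting the pieces together, a hedged sketch of the full workflow, assuming ssh access to the cluster and that the ipcluster from the previous step is running (the username is a placeholder):

import hpc05

# One-time setup: create the PBS profile on the cluster.
hpc05.create_remote_pbs_profile(username='myuser', hostname='hpc05')  # placeholder username

# With `ipcluster start --n=10 --profile=pbs` running in a screen on the
# cluster, connect over ssh and use the client like any ipyparallel client.
client = hpc05.Client(profile='pbs', hostname='hpc05')
view = client.load_balanced_view()
print(view.map_sync(lambda x: x ** 2, range(10)))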

hpc05.client.get_culler_cmd(profile='pbs', env_path=None, culler_args=None)[source]
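
No description is given for get_culler_cmd; judging from its parameters, it appears to build the command that Client uses to start the engine culler. A hedged example of calling it with the documented keyword arguments (the timeout value is illustrative):

from hpc05.client import get_culler_cmd

# Presumably returns the command used to launch the engine culler.
cmd = get_culler_cmd(profile='pbs', culler_args='--timeout=200')
print(cmd)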