From 5d26f7e8bf84d5ed055dc1e88996a25c4db80e85 Mon Sep 17 00:00:00 2001
From: Brett Smith
Date: Thu, 21 May 2015 16:06:16 -0400
Subject: [PATCH] Update tutorial pipeline page to match new definition.

No issue #.
---
 .../running-external-program.html.textile.liquid | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/doc/user/tutorials/running-external-program.html.textile.liquid b/doc/user/tutorials/running-external-program.html.textile.liquid
index 18f5f7d35f..6e2961360d 100644
--- a/doc/user/tutorials/running-external-program.html.textile.liquid
+++ b/doc/user/tutorials/running-external-program.html.textile.liquid
@@ -21,12 +21,13 @@ This will open the template record in an interactive text editor (as specified b
 * @"name"@ is a human-readable name for the pipeline.
 * @"components"@ is a set of scripts or commands that make up the pipeline. Each component is given an identifier (@"bwa-mem"@ and @"SortSam"@) in this example).
 ** Each entry in components @"components"@ is an Arvados job submission. For more information about individual jobs, see the "job object reference":{{site.baseurl}}/api/schema/Job.html and "job create method.":{{site.baseurl}}/api/methods/jobs.html#create
-* @"repository"@, @"script_version"@, and @"script"@ indicate that we intend to use the external @"run-command"@ tool wrapper that is part of the Arvados. These parameters are described in more detail in "Writing a script":tutorial-firstscript.html
+* @"repository"@, @"script_version"@, and @"script"@ indicate that we intend to use the external @"run-command"@ tool wrapper that is part of the Arvados. These parameters are described in more detail in "Writing a script":tutorial-firstscript.html.
 * @"runtime_constraints"@ describes runtime resource requirements for the component.
-** @"docker_image"@ specifies the "Docker":https://www.docker.com/ runtime environment in which to run the job. The Docker image @"arvados/jobs-java-bwa-samtools"@ supplied here has the Arvados SDK, Java runtime environment, bwa, and samtools installed.
+** @"docker_image"@ specifies the "Docker":https://www.docker.com/ runtime environment in which to run the job. The Docker image @"bcosc/arv-base-java"@ supplied here has the Java runtime environment, bwa, and samtools installed.
+** @"arvados_sdk_version"@ specifies a version of the Arvados SDK to load alongside the job's script.
 * @"script_parameters"@ describes the component parameters.
 ** @"command"@ is the actual command line to invoke the @bwa@ and then @SortSam@. The notation @$()@ denotes macro substitution commands evaluated by the run-command tool wrapper.
-** @"stdout"@ indicates that the output of this command should be captured to a file.
+** @"task.stdout"@ indicates that the output of this command should be captured to a file.
 ** @$(node.cores)@ evaluates to the number of cores available on the compute node at time the command is run.
 ** @$(tmpdir)@ evaluates to the local path for temporary directory the command should use for scratch data.
 ** @$(reference_collection)@ evaluates to the script_parameter @"reference_collection"@
@@ -34,7 +35,7 @@ This will open the template record in an interactive text editor (as specified b
 ** @$(dir $(...))@ constructs a local path to a directory representing the supplied Arvados collection.
 ** @$(file $(...))@ constructs a local path to a given file within the supplied Arvados collection.
 ** @$(glob $(...))@ searches the specified path based on a file glob pattern and evalutes to the first result.
 ** @$(basename $(...))@ evaluates to the supplied path with leading path portion and trailing filename extensions stripped
-** @"output_of"@ indicates that the @output@ of the @bwa-mem@ component should be used as the @"input"@ of @SortSam@. Arvados uses these dependencies between components to automatically determine the correct order to run them.
+* @"output_of"@ indicates that the @output@ of the @bwa-mem@ component should be used as the @"input"@ script parameter of @SortSam@. Arvados uses these dependencies between components to automatically determine the correct order to run them.
 
 When using @run-command@, the tool should write its output to the current working directory. The output will be automatically uploaded to Keep when the job completes.
-- 
2.30.2
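
For context, a minimal sketch of the kind of two-component pipeline template the patched page describes is shown below. Only the pieces named in the diff are grounded in the patch: the @run-command@ wrapper, the @"bcosc/arv-base-java"@ Docker image, @"arvados_sdk_version"@, @"task.stdout"@, the @"reference_collection"@ script parameter, and the @"output_of"@ link from @SortSam@ back to @bwa-mem@. The @"repository"@ and @"script_version"@ values, the @"sample"@ and @"picard"@ parameters, the parameter declarations, and the exact command arguments are illustrative assumptions only (JSON cannot carry comments, so all caveats are stated here rather than inline).

{
  "name": "Tutorial align using bwa mem and SortSam",
  "components": {
    "bwa-mem": {
      "repository": "arvados",
      "script": "run-command",
      "script_version": "master",
      "runtime_constraints": {
        "docker_image": "bcosc/arv-base-java",
        "arvados_sdk_version": "master"
      },
      "script_parameters": {
        "command": [
          "bwa",
          "mem",
          "-t",
          "$(node.cores)",
          "$(glob $(dir $(reference_collection))/*.fasta)",
          "$(glob $(dir $(sample))/*_1.fastq)",
          "$(glob $(dir $(sample))/*_2.fastq)"
        ],
        "reference_collection": {
          "required": true,
          "dataclass": "Collection"
        },
        "sample": {
          "required": true,
          "dataclass": "Collection"
        },
        "task.stdout": "output.sam"
      }
    },
    "SortSam": {
      "repository": "arvados",
      "script": "run-command",
      "script_version": "master",
      "runtime_constraints": {
        "docker_image": "bcosc/arv-base-java",
        "arvados_sdk_version": "master"
      },
      "script_parameters": {
        "command": [
          "java",
          "-jar",
          "$(dir $(picard))/SortSam.jar",
          "SORT_ORDER=coordinate",
          "INPUT=$(glob $(dir $(input))/*.sam)",
          "OUTPUT=output.sort.bam"
        ],
        "input": {
          "output_of": "bwa-mem"
        },
        "picard": {
          "required": true,
          "dataclass": "Collection"
        }
      }
    }
  }
}

Because @SortSam@'s @"input"@ names @bwa-mem@ via @"output_of"@, Arvados runs @bwa-mem@ first and substitutes its output collection as the value of @"input"@ when @SortSam@ starts.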