WRAPP CLI usage

WRAPP provides a command line tool that helps with asset packaging and publishing operations for assets stored on Nucleus servers or file systems. It encourages a structured workflow for defining the content of an asset package, and provides methods to publish and consume those packages in a version-safe manner.

Design

The WRAPP command line tool is a pure Nucleus client utilizing only publicly available APIs; it lives completely in the Nucleus user space. Thus, all operations performed are limited by the permissions granted to the user executing the script.

The tool itself offers a variety of commands that document themselves via the --help command line flag. To get a list of all commands, run

wrapp --help

To get the help for a single command, run e.g.

wrapp create --help

The commands are displayed in alphabetical order, but it is important to understand that the design is based on three layers of increasing abstraction over a pure file-system based workflow. Those layers are:

  1. Files & Folders

  2. Packages

  3. Stages

We will present the commands from the lowest abstraction to the highest, because this makes it easier to understand how the later commands function; in day-to-day usage, however, mostly layers 2 and 3 will be used.

Supported URLs

Wrapp in general accesses data through the Omniverse Client-Library and therefore supports URLs to Nucleus servers, S3 buckets, Azure containers/blobs and to the local file system:

Nucleus Servers : Data on Nucleus servers can be accessed using “omniverse://…” URLs. Authentication will by default occur interactively; for more details, please refer to the Nucleus documentation.

Azure : Data on Azure can be accessed using “https://…..blob.core.windows.net” URLs. For more details on authentication and requirements on the Azure containers/blobs, please refer to the client-library documentation.

S3 : Data on S3 can be accessed using “http(s)://…cloudfront.net” or “http(s)://…amazonaws.com” URLs. For more details on authentication and requirements on the S3 buckets, please refer to the client-library documentation.

Local file system : Data on the local file system can be accessed using “file://localhost/….” or “file:///…” URLs. Any URL or path that has no scheme is interpreted as a file path, so you can specify file:local_folder or local_folder to address a local directory.

Not all commands support all URL types for all parameters.

Generic parameters

Most if not all commands support the following parameters:

  • --verbose Specify this to have more visibility on what is currently being processed

  • --time Measure the wall clock time the command took to execute

  • --stats Produce some stats about the estimated number of roundtrips and the file counts encountered. Note that many of these roundtrips may be cached and not actually executed; this is informative in nature rather than a benchmark

  • --jobs Specify the maximum number of parallel jobs executed. The default is 100. This can be useful to throttle the load on the server while running bulk operations. Note that downloads are always capped at 10 (or use the OMNI_CONN_TRANSFER_MAX_CONN_COUNT setting to set this explicitly)

  • --tagging-jobs Specify the maximum number of parallel jobs run on the tagging service. Default is 50.

  • --log-file Specify the name of the log file. The default name is wrapp.log

  • --debug Turn on debug level logging for client library

  • --json-logging Use this to produce a JSON structured log instead of a human readable log

Authentication

By default, wrapp uses interactive authentication appropriate for the server and server version you are contacting. It might open a browser window to allow for single sign-on workflows. Successful connections will be cached, and no further authentication will be required for subsequent commands.

If this is not desired or not possible, as in headless programs, the --auth parameter can be used to supply credentials. The credentials need to be a comma separated triplet, consisting of

  1. The server URL. This needs to start with omniverse:// and must match the server name as used in the URLs that target the server.

  2. The username. This can be a regular username or the special name $omni-api-token when the third item is an API token and not a password

  3. The password for that user, or the API token generated for a single sign-on user.

As an example, this is how to specify a wrapp command authenticating against a localhost workstation with the default username and password:

wrapp list-repo omniverse://localhost --auth omniverse://localhost,omniverse,omniverse

and this is how you would use an API token stored in an environment variable on Windows (See API Tokens in the Nucleus documentation):

wrapp list-repo omniverse://staging.nvidia.com/staging_remote/beta_packages --auth omniverse://staging.nvidia.com,$omni-api-token,%STAGING_TOKEN%

On Linux, don’t forget to escape the $ in $omni-api-token.
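
For scripting, the triplet can also be assembled programmatically before being passed to --auth. Here is a minimal Python sketch, assuming the token lives in the STAGING_TOKEN environment variable as in the example above (server name and variable name are just illustrations):

```python
import os

# Assemble the comma separated --auth triplet for wrapp.
# Server name and environment variable are example values.
os.environ.setdefault("STAGING_TOKEN", "example-token")  # stand-in for a real token

server = "omniverse://staging.nvidia.com"
user = "$omni-api-token"              # special user name for API tokens
token = os.environ["STAGING_TOKEN"]

auth = f"{server},{user},{token}"     # pass this as the --auth value
print(auth)
```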

Running wrapp commands concurrently

If several wrapp commands are executed and awaited concurrently, it is strongly recommended to run them in a single context created with the CommandContext.run_scheduler method.

Layer 1 commands - Files & Folders and their metadata

Catalog

The catalog command can be used to create a list of files and folders in a specified subtree and store the result together with explicit version information in a catalog (aka manifest) file.

To catalog the content of a specific subtree on your localhost Nucleus with the assets being at the path NVIDIA/Assets/Skies/Cloudy/, just run:

wrapp catalog omniverse://localhost/NVIDIA/Assets/Skies/Cloudy/ cloudy_skies_catalog.json --local-hash

Of course, replace localhost with the server name if the data is somewhere else. The --local-hash flag is required here only because the data in the example is stored on a mount; it is also needed if the data is not checkpointed. With --local-hash, the hashes are calculated on the fly, but the data needs to be downloaded to your local machine!

The produced JSON file archives the files and their versions at the very moment the command was run. Since the server is live, running the command again might produce a different catalog if files are added, deleted, or updated in the meantime.

To determine whether the cataloged version is still the same, we can use the diff command to compare two catalogs made at different points in time, or even at different copy locations of the same asset.

Ignore rules, e.g. for thumbnails

The command supports ignore rules that are by default read from a file called .wrappignore in the current working directory. The name of the ignore file can also be specified with the --ignore-file parameter, e.g. --ignore-file myignorefile.txt.

For example, to ignore all thumbnail directories during the cataloging operation and not include them in the package, create a file called .wrappignore in your current directory containing the line

.thumbs
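
The ignore file can of course also be created from a script; a trivial Python sketch:

```python
from pathlib import Path

# Write a .wrappignore in the current directory that excludes
# thumbnail directories from cataloging.
Path(".wrappignore").write_text(".thumbs\n")
```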

Tags

If tags need to be cataloged, copied, and diffed as well, specify the --tags parameter. This will do a second pass using the Omniverse tagging service and will archive the current state of tags, their namespaces and values in the catalog file:

wrapp catalog omniverse://example.nvidia.com/lib/props/vegetation vegetation_tagged.json --tags

Creating a catalog from a file list

Should your asset be structured differently from a simple folder tree that is traversed recursively by the catalog operation, you can create and specify a file list in the form of a tab separated URL list split into the base and the relative path.

As an example, the following file list describes an asset whose files come from two different folders:

omniverse://localhost/NVIDIA/Assets/Skies/Clear/ <TAB> evening_road_01_4k.hdr
omniverse://localhost/NVIDIA/Assets/Skies/Dynamic/ <TAB> Cirrus.usd

If this is stored in a file called input_files.tsv (with a proper ASCII tab character instead of the placeholder), you can create the catalog of this asset with the --file-list parameter like this:

wrapp catalog input_files.tsv evening_road.json --local-hash --file-list

Both files will now be in the root directory of the package to be created, as only the relative part of the path is kept.
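A file list like the one above can be generated from a script; this Python sketch writes the example entries with real TAB separators:

```python
# Write the example file list with real TAB characters between the
# base URL and the relative path (the <TAB> placeholder from the docs).
entries = [
    ("omniverse://localhost/NVIDIA/Assets/Skies/Clear/", "evening_road_01_4k.hdr"),
    ("omniverse://localhost/NVIDIA/Assets/Skies/Dynamic/", "Cirrus.usd"),
]
with open("input_files.tsv", "w") as fh:
    for base, rel in entries:
        fh.write(f"{base}\t{rel}\n")
```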

Diff

The diff command compares two catalogs and can be used to find out what has changed, or what the differences are between two copies of the same subtree.

Assuming we have two catalogs of the same package from the same location at two different dates, we can just run

wrapp diff vegetation_catalog_20230505.json vegetation_catalog_20230512.json --show

The --show flag asks the command not only to report whether there is a diff (the exit code will be 1 if a diff is detected, 0 otherwise), but also to print a list of the items that are only in catalog 1 but not in 2, those that are only in 2 but not in 1, and the files that differ in their content.
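
The three categories diff reports amount to a set comparison. In this illustrative Python sketch, catalogs are modeled as simple {relative_path: content_hash} dicts; the real catalog JSON contains more information per entry:

```python
# Toy catalogs: relative path -> content hash (schema is an assumption
# for illustration, not the actual catalog file format).
cat1 = {"a.usd": "h1", "b.usd": "h2", "c.usd": "h3"}
cat2 = {"b.usd": "h2", "c.usd": "h9", "d.usd": "h4"}

only_in_1 = sorted(cat1.keys() - cat2.keys())   # present only in catalog 1
only_in_2 = sorted(cat2.keys() - cat1.keys())   # present only in catalog 2
changed = sorted(p for p in cat1.keys() & cat2.keys() if cat1[p] != cat2[p])

print(only_in_1, only_in_2, changed)  # ['a.usd'] ['d.usd'] ['c.usd']
```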

Get

Sometimes, it can be handy to have a quick way of retrieving a single file or folder with a command line tool. This is what the get command was made for. To retrieve a single file onto your local disk, just do

wrapp get omniverse://localhost/NVIDIA/Assets/Isaac/2022.1/Isaac/Materials/Isaac/nv_green.mdl

and the tool will download the file.

Cat

For viewing the content of a single text file, you can issue the cat command and wrapp will download the content and print it to stdout:

wrapp cat omniverse://localhost/NVIDIA/Assets/Isaac/2022.1/Isaac/Materials/Isaac/nv_green.mdl 

Freeze

The freeze command is used to freeze or archive a specific version into a new location. This is used to make sure a specific version can be reproducibly addressed at that location, e.g. to run a CI job on a specific version or to create a reproducible version for QA testing and subsequent release.

The freeze command has two modes.

The first mode takes a source subtree URL and creates a copy of the head version of the files at the source position. If both source and destination are on the same Nucleus server, the operation is efficient as no data has to be transferred and the files and folders at the new destination are effectively hard links to the same content, causing no data duplication. Note that the history is not copied and the checkpoint numbers will not be the same as in the source.

Here is a command to freeze the example vegetation package at the current head version into a new subtree on the same server:

wrapp freeze omniverse://example.nvidia.com/lib/props/vegetation omniverse://example.nvidia.com/archive/props/vegetation_20230505

The second mode of the command just takes a catalog file as input and again a destination path as second parameter, but needs the flag --catalog.

wrapp freeze vegetation_catalog_20230505.json omniverse://example.nvidia.com/archive/props/vegetation_20230505 --catalog 

Note that while this allows you to defer the copy to a later point and only catalog the files as a first step, there is no guarantee that the freeze operation will still be able to find all files listed in the catalog - they might have been moved away or obliterated. So while creating the catalog first and freezing later is an optimization, be aware that the content referenced by the catalog file is not securely stored.

One useful option is to specify a local file URL as destination; this allows you to copy a specific cataloged version out to local disk, e.g. to run a CI job on it:

wrapp freeze vegetation_catalog_20230505.json file:/c:/build_jobs/20230505 --catalog

Freeze also uses the .wrappignore file like catalog does, and likewise supports the --ignore-file parameter. So even if files are part of the catalog, they can be ignored at the freeze stage by providing an ignore file.

To respect tags during the freeze operation and make sure they are copied as well, specify the flag --copy-tags. Note this has no effect when copying within the same Nucleus server, as tags are always copied in that case.

create-patch and apply-patch

The create-patch command uses a three-way comparison to produce a patch file that will merge one file tree into another, given their common ancestor. For example, suppose we have a file tree at time point 1 and have created a catalog file for it called “catalog_1.json”. We copy this state to a new location and use it from there. Work then continues in the original location, and we create a new catalog at time point 2 called “catalog_2.json”.

If we now want to update the copy of the file tree at our use location, and want to know whether it is safe to overwrite the tree or whether there are local modifications we should be alerted about, we use the following steps:

  1. First, also catalog the target location; let’s call this catalog_target.json.

  2. Then, run the following command to produce the patch or delta file which contains the operations needed for the update:

    wrapp create-patch catalog_target.json catalog_2.json catalog_1.json --patch update_target.json

  3. If the command produces the patch file successfully, this indicates that no local changes have been made and there are no conflicts. Then run the following command to apply the changes in update_target.json to the target:

    wrapp apply-patch update_target.json

After this command, the file tree at the target location matches the file tree at the source at time point 2. This is the operation that is done by the higher level update command.

In case there are local changes to the target, two options are offered:

  1. To ignore local changes and keep them, only adding new files and new versions where the target is unmodified, specify the --ignore parameter to the create-patch command.

  2. To roll back and lose local changes in the target, specify the --force parameter to the create-patch command; this will produce a larger patch file that also contains the rollback commands.

Layer 2 commands - Packages

So far we have only worked with subtrees like in a versioned file system. This is very powerful and can be used for many use cases, but to have an easier workflow with less complex URLs and fewer possibilities for mistakes, we introduce a few conventions and new commands.

The concept of a repository is known from distributed version control systems like git, where it denotes a location where a repository or module is stored. We use the term repository for a directory on a Nucleus server that is used as an intermediate safe storage for the frozen/archived versions; consumers of these files use it as a copy source.

The package directory is called .packages. Each folder in there represents a named package, and has sub-folders for the named versions of that package. No prescriptions are made for how packages or versions have to be named; they just have to be valid file and folder names.

An example package cache could look like this:

  • /.packages

  • /.packages/vegetation_pack

  • /.packages/vegetation_pack/20230505

  • /.packages/vegetation_pack/20230512

  • /.packages/vegetation_pack/20230519

  • /.packages/rocks_pack

  • /.packages/rocks_pack/v1.0.0

  • /.packages/rocks_pack/v1.1.0
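
Because the layout follows this simple convention, package names and versions can be recovered from the paths alone; a small Python sketch over the example cache above:

```python
# Derive package names and versions from .packages paths
# (paths taken from the example cache layout above).
paths = [
    "/.packages/vegetation_pack/20230505",
    "/.packages/vegetation_pack/20230512",
    "/.packages/vegetation_pack/20230519",
    "/.packages/rocks_pack/v1.0.0",
    "/.packages/rocks_pack/v1.1.0",
]

packages = {}
for p in paths:
    _, _, name, version = p.split("/")   # "", ".packages", name, version
    packages.setdefault(name, []).append(version)

print(packages)
```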

Concretely, we introduce the new commands new, create, install, and list-repo.

We allow both named and unnamed packages to be used. Unnamed packages are top level directories that are just consumers of packages produced elsewhere and have no package description file of their own. Named packages are all packages that have a file .<package name>.wrapp. You can create a named package by using the new command, or create a named package from an unnamed package during the create operation (which will leave the unnamed source package unnamed - but you can run new for a directory that already contains files!).

New

The new command does not operate on any files or packages; rather, it is a shortcut to create a suitable .<package>.wrapp file to be used by subsequent install commands.

For instance, when creating a new scenario and wanting to capture the asset packages used, it is useful to have a dependency file such as wrapp.toml (any name is fine) that will record the dependencies installed.

As an example, just run

wrapp new san_diego_scenario 1.0.0 omniverse://localhost/scenarios/san_diego

This will create a single file .san_diego_scenario.wrapp in the given location.

You can display the contents with

wrapp cat omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp

and it will look similar to this:

{
  "format_version": "1",
  "name": "san_diego_scenario",
  "version": "1.0.0",
  "catalog": null,
  "remote": null,
  "source_url": "omniverse://localhost/scenarios/san_diego",
  "dependencies": null
}
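
Since the .wrapp file is plain JSON, it can also be read programmatically, for example to check which version a package file records. A sketch using the example content shown above:

```python
import json

# Content reproduced from the example .wrapp file above.
wrapp_text = """{
  "format_version": "1",
  "name": "san_diego_scenario",
  "version": "1.0.0",
  "catalog": null,
  "remote": null,
  "source_url": "omniverse://localhost/scenarios/san_diego",
  "dependencies": null
}"""

pkg = json.loads(wrapp_text)
print(pkg["name"], pkg["version"])  # san_diego_scenario 1.0.0
```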

Create

The create command is a shorter form of freeze. The destination directory for the freeze operation is always a package cache directory, which by default is on the same Nucleus server as the source data.

To create a versioned package for reuse from our previous example, run:

wrapp create --package omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp

When you want to later create a new version of this package, just additionally specify the new version:

wrapp create --package omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp --new-version 2.0.0

Alternatively, if you have not run new and there is no .wrapp file in the package directory, you can just specify the name and version directly. This will create a .wrapp file only in the package cache, not in the source of the package:

wrapp create vegetation_pack 20230505 omniverse://localhost/lib/props/vegetation

This will create a copy of the vegetation library in the default package cache at omniverse://localhost/.packages/vegetation_pack/20230505.

You can use the --repo option to specify a different downstream Nucleus server to receive the data, but note that this will first download the data and then upload it to the other server. For example, to create the package on a different Nucleus server that is used for staging tests, we could run:

wrapp create vegetation_pack 20230505 omniverse://localhost/lib/props/vegetation --repo omniverse://staging.nvidia.com

This will create a copy of the vegetation library in omniverse://staging.nvidia.com/.packages/vegetation_pack/20230505.

Additionally, this will create a wrapp file recording the package name, the version, and the source from which it was created. The name will be .{package_name}.wrapp. Running the new command to prepare a .wrapp file is optional, create will generate the file in case there is none yet.

Alternatively, packages can be created from previously generated catalogs as well. For this, specify the filename of the catalog file instead of a source URL and add the --catalog option:

wrapp create vegetation_pack 20230505 --catalog vegetation.json --repo omniverse://staging.nvidia.com

List-repo

With the concept of repositories, you can also list the packages available on any of them. Running

wrapp list-repo omniverse://localhost

would give you the list of known packages with the list of the versions present, for example the output could be

> wrapp list-repo omniverse://localhost
vegetation_pack: 20230505, 20230401

to show you that one package is available, in two different versions.
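
The output format is simple enough to parse in scripts; a small Python sketch using the example line above:

```python
# Parse a list-repo output line of the form "name: v1, v2" into
# a package name and a list of versions.
output = "vegetation_pack: 20230505, 20230401"

name, _, rest = output.partition(":")
versions = [v.strip() for v in rest.split(",")]
print(name, versions)  # vegetation_pack ['20230505', '20230401']
```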

Install

These are still pure file based operations, and when we copy a version of the asset library into a folder with a version name in its path, all references to these files would obviously need to be updated, making it harder to update to a new version of that asset library from within USD.

The idea here is to not reference the package archive directly from within the USD files and materials, but rather to create yet another copy as a subfolder of the scenario or stage, where that subfolder has no version in its path.

This can most easily be achieved via the install command. Assume the author of a SanDiego scenario stored at omniverse://localhost/scenarios/SanDiego wants to use the vegetation asset pack in a specific version. This can be done with the following command line:

wrapp install vegetation_pack 20230505 omniverse://localhost/scenarios/SanDiego/asset_packs

This will look for the package version in the server's .packages directory, and make a hard linked copy in the specified subdirectory asset_packs/, from where the assets can be imported and used in the scenario scene.

The install command can also be used to update a package at the same location to a different version (it actually also allows downgrading). For that, just specify a different version number. This command will check whether the installed package is unmodified; otherwise it will fail with conflicts (to override, just delete the package at the install location and run install again).

To update the previously installed vegetation_pack to a newer version, just run

wrapp install vegetation_pack 20230523 omniverse://staging.nvidia.com/scenarios/SanDiego/asset_packs

If you use more than one package, it can quickly get complicated to remember which package was installed from where. To help with this, wrapp introduces the concept of package files with dependencies.

To create/update a dependency file, specify an additional parameter to the install command like this:

wrapp install vegetation_pack 20230523 omniverse://staging.nvidia.com/scenarios/SanDiego/asset_packs --package omniverse://staging.nvidia.com/scenarios/SanDiego/.sandiego.wrapp

This will create a file .sandiego.wrapp at the specified location.

If any of the files the install command needs to modify have been manually changed in the installation folder, the installation will fail with an appropriate error message, indicating that the file in the installation folder cannot be updated to match the file in the package folder. This is called a “conflict”. The following examples constitute conflicts:

  • The same file has been changed in the installation folder and the package but with different content.

  • A new file has been added to both the installation folder and the package but with different content.

  • A file has been deleted from the package, but modified in the installation folder.

This conflict mechanism protects the user from losing any data or modifications to the installation folder. To update the installation folder in such a situation, the patch/apply mechanism can be used.

In order to record the conflicts into a patch file, the failed installation can be rerun with an additional parameter specifying the name of the patch file to create. This will apply all non-conflicting changes and record all conflicts in the patch file:

wrapp install vegetation_pack 20230925 omniverse://staging.nvidia.com/scenarios/SanDiego/asset_packs --patch install_conflicts.patch

The install_conflicts.patch file is a JSON file with the operations that would resolve/override the conflicts. Inspect this and edit or remove operations that are not desired, then apply with

wrapp apply-patch install_conflicts.patch
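
Since the patch file is JSON, unwanted operations can also be filtered out in a script before applying. Note that the exact patch schema is not documented here; the "operations" and "op" keys below are purely hypothetical, for illustration only:

```python
# Hypothetical patch structure - the real schema may differ; inspect
# an actual patch file before relying on specific keys.
patch = {"operations": [
    {"op": "copy", "path": "tree/new_leaf.usd"},
    {"op": "delete", "path": "tree/old_leaf.usd"},
]}

# Keep everything except delete operations.
patch["operations"] = [o for o in patch["operations"] if o["op"] != "delete"]
print(len(patch["operations"]))  # 1
```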

Uninstall

Any package that has been installed can be uninstalled again. There are two modes of uninstallation: via the directory in which the package has been installed, or via pointing to the dependency file which had been used to record the install operation. In the latter case, uninstall will also remove the dependency information recorded in that file.

Uninstall via directory:

wrapp uninstall vegetation_pack omniverse://staging.nvidia.com/scenarios/SanDiego/asset_packs

or via package file, no need to specify the installation directory:

wrapp uninstall vegetation_pack --package omniverse://staging.nvidia.com/scenarios/SanDiego/dependencies.toml 

Mirror

When working with multiple servers, it might make sense to transfer created packages (or rather specific versions of these) into the .packages folder on another server so install operations on that server are fast and don’t need to specify the source server as a repository.

This is what the mirror operation is built for - it will copy a package version from one server’s .packages directory into another server’s .packages directory.

The simple format of the command is

wrapp mirror vegetation_pack 20230523 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com

There is the possibility to resume an aborted transfer. This is implemented by first cataloging the destination directory and then calculating and applying a delta patch. Activate this behavior with the --resume parameter. If the destination directory does not exist, this parameter does nothing and is ignored:

wrapp mirror vegetation_pack 20230523 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com --resume

To accelerate the upload of subsequent versions, we can force a differential upload versus an arbitrary version that has already been mirrored; just specify the template version as an additional parameter:

    wrapp mirror vegetation_pack 20230623 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com --template-version 20230523

This will first copy, on the target server, the version specified as template version into the target folder. Then, it will calculate a differential update and only upload and delete files that have changed. This can be a big time saver when many files stayed the same between versions, but it will slow things down if the difference is actually large, because it has to do the additional copy on the destination server and catalog the result of that copy in the destination directory. [Optimization possible - we could rewrite the source catalog so the subsequent catalog is not required.]

Export

Instead of directly copying a package from server to server using the mirror command, you can also have wrapp create a tar file with all contents of a package for a subsequent import operation.

To export, just run

wrapp export vegetation_pack 20230623 --repo omniverse://dev.nvidia.com 

this will download everything to your computer and produce an uncompressed tar file called vegetation_pack.20230623.tar. You can specify an alternative output file name or path with the --output option.

You can also specify a catalog to export using “export --catalog”, e.g.

wrapp export vegetation_pack 20230505 --catalog vegetation.json

This allows creating tar files and packages from arbitrary sources, e.g. data hosted on S3 or Azure. If you plan on importing the data later using the “wrapp import” command, consider using the “--dedup” switch to avoid downloading and storing the same content several times in the tar file.

Import

As you might have guessed, an exported package can also be imported again. To do that, run

wrapp import vegetation_pack.20230623.tar --repo omniverse://staging.nvidia.com

to import the package into the .packages folder on the specified receiving repository.