WRAPP Overview

WRAPP provides a command-line tool for asset packaging and publishing operations on assets stored in S3 buckets, Nucleus servers, or file systems in a decentralized way. It encourages a structured workflow for defining the content of an asset package, and provides methods to publish and consume those packages in a version-safe manner. WRAPP does not require a wholesale shift of your workflow; instead, it helps you migrate to a structured, version-managed workflow.

Below is a diagram illustrating the data flow with generic content files using WRAPP. When a new package is created, its content is stored within that package version in the repository. The package can then be installed in new locations as needed.

[Diagram: WRAPP data flow from source content to a package version in the repository and on to install locations]

Key points and Assumptions

  • WRAPP does not force you into a defined directory structure; it allows the user to define the package structure.

  • WRAPP does not modify your files or the reference paths in your files. If you want your reference versions managed by WRAPP, it is assumed that all references are relative within the cataloged directory.

  • WRAPP allows for multiple repositories on a single server or across multiple Nucleus servers or S3 buckets.

  • WRAPP allows you to aggregate packages from multiple repositories into “target” directories, and even allows for package dependencies across repositories.

  • WRAPP currently supports repositories on S3 buckets, local file systems and Nucleus servers.

CLI Usage

The WRAPP command line tool utilizes only publicly available APIs for accessing S3 and Nucleus servers. Thus, all operations performed are limited by the permissions granted to the user executing the script.

The tool itself offers a variety of commands that document themselves via the --help command line flag. To get a list of all commands, run

wrapp --help

To get help for a single command, supply the command name as in the following example.

wrapp create --help

The commands are displayed in alphabetical order, but this documentation is organized to guide you from simple usage first, to get you going and comfortable with package creation and use.

Then we will lead you into the more complex commands that open up the full capabilities of the WRAPP toolset and expand where WRAPP can help in your pipelines.

Command Organization

Package commands

These are commands that revolve around package creation, installation and management.

  • Primary: The most common commands you will run, giving you quick access to version management with WRAPP.

  • Advanced: These are more advanced commands to define and manage your content packages and how to distribute content with WRAPP.

Utility commands

  • File & Folders and their metadata: These are some very helpful, Linux-like shell commands that work on files or folders. They make it easier to view ASCII files, list directory trees, and download and archive data.

Supported URLs

WRAPP currently supports URLs to Nucleus servers (read/write), S3 buckets (read/write), Azure containers/blobs (read-only) and the local file system (read/write):

Nucleus Servers : Data on Nucleus servers can be accessed using omniverse://... URLs. Authentication occurs interactively by default; if this is not desired, please refer to the Nucleus authentication section.

S3 : Data on S3 can be accessed using http(s)://...cloudfront.net, s3://... or http(s)://...amazonaws.com URLs. WRAPP will use boto3 to access S3 when using s3://... or https://...amazonaws.com URLs. http(s)://...cloudfront.net and http://...amazonaws.com URLs are opened via the client-library. For details on authentication, please refer to the S3 authentication section.

WRAPP is primarily tested with S3 general purpose buckets without any additional features enabled.

Azure : Data on Azure can be accessed using https://.....blob.core.windows.net URLs. For more details on authentication and requirements on the Azure containers/blobs, please refer to the client-library documentation.

Local file system : Data on the local file system can be accessed using file://localhost/.... or file:///... URLs. Any URL or path that has no scheme is interpreted as a file path, so you can specify file:local_folder or local_folder to address a local directory.

Not all commands support all URL types for all parameters. In particular, it is not yet possible to copy files from cloud storage directly to cloud storage. If you are in need of an operation like this, please consider using the import/export commands or using the local file system as an intermediate step.

Generic parameters

Most if not all commands support the following parameters:

  • --verbose Specify this to have more visibility on what is currently being processed

  • --time Measure the wall clock time the command took to execute

  • --stats Produce some stats about the estimated number of roundtrips and file counts encountered. Note that many of these roundtrips may be cached and not actually executed, so this is informative rather than a benchmark

  • --jobs Specify the maximum number of parallel jobs executed. The default is 100. This can be useful to throttle load on the server while running bulk operations. Note that downloads are always maxed out at 10 (or use the OMNI_CONN_TRANSFER_MAX_CONN_COUNT setting to set this specifically)

  • --file-transfer-jobs Specify the maximum number of concurrent file transfers executed. The default is 50.

  • --tagging-jobs Specify the maximum number of parallel jobs run on the tagging service. Default is 50.

  • --log-file Specify the name of the log file. The default name is wrapp.log

  • --debug Turn on debug level logging for client library

  • --json-logging Use this to produce a JSON structured log instead of a human readable log
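As a sketch, these generic flags can be combined with any command; the server URL below is hypothetical:

```shell
# Hypothetical server URL. --verbose shows what is being processed,
# --time and --stats report wall clock time and roundtrip estimates,
# --jobs throttles parallelism, --log-file redirects the log:
wrapp list-repo omniverse://example.nvidia.com \
    --verbose --time --stats --jobs 20 --log-file repo_listing.log
```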

Nucleus Authentication

By default, for Nucleus servers, wrapp uses interactive authentication appropriate for the server and server version you are contacting. It might open a browser window to allow for single sign-on workflows. Successful connections are cached, so no further authentication is required when running subsequent commands.

If this is not desired or not possible, as in headless programs, use the --auth parameter to supply credentials. The credentials need to be a comma-separated triplet consisting of

  1. The server URL. This needs to start with omniverse:// and must match the server name as used in the URLs that target the server.

  2. The username. This can be a regular username or the special name $omni-api-token when the third item is an API token and not a password

  3. The password for that user, or the API token generated for a single sign-on user.

As an example, this is how to specify a wrapp command authenticating against a localhost workstation with the default username and password:

wrapp list-repo omniverse://localhost --auth omniverse://localhost,omniverse,omniverse

and this is how you would use an API token stored in an environment variable on Windows (See API Tokens in the Nucleus documentation):

wrapp list-repo omniverse://staging.nvidia.com/staging_remote/beta_packages --auth omniverse://staging.nvidia.com,$omni-api-token,%STAGING_TOKEN%

On Linux, don’t forget to escape the $ in $omni-api-token so your shell does not expand it.
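A sketch of the Linux quoting, assuming the token is available in a STAGING_TOKEN environment variable: single quotes keep $omni-api-token literal, while the token itself is expanded normally.

```shell
# Stand-in token value; in practice STAGING_TOKEN is set by your environment or CI system.
STAGING_TOKEN=example-token

# Single quotes protect the literal $omni-api-token from shell expansion;
# the adjacent double-quoted part substitutes the real token value:
AUTH='omniverse://staging.nvidia.com,$omni-api-token,'"$STAGING_TOKEN"
echo "$AUTH"
```

The resulting string can then be passed to wrapp as --auth "$AUTH".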

For more details on authentication on the Azure containers/blobs and S3 buckets when accessing them via the client-library, please refer to the client-library documentation.

S3 Authentication

When WRAPP directly accesses S3 via boto3, authentication and other configuration is done through the standard boto3 mechanisms. For details, please refer to the boto3 documentation for credentials, and configuration via environment variables and config files.
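For example, a headless job might supply credentials through boto3's standard environment variables (the values below are placeholders, not real credentials):

```shell
# Placeholder credentials - substitute real values, or configure an AWS
# profile / config file instead. boto3 picks these up automatically for
# s3://... and https://...amazonaws.com URLs:
export AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEY
export AWS_SECRET_ACCESS_KEY=examplesecret
export AWS_DEFAULT_REGION=us-east-1
```

Named profiles selected via the AWS_PROFILE environment variable work as well.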

Package commands - Primary

Packages in wrapp are stored in a repository. The concept is familiar from distributed version control systems like git and denotes a location where packages or modules are stored. We use the term repository for a directory on a Nucleus server that is used as safe intermediate storage for the frozen/archived versions. Consumers of these files use it as a copy source.

The repository directory is called .packages. Each folder in there represents a named package and has sub-folders for the named versions of that package. No prescriptions are made for how packages or versions have to be named; they just have to be valid file and folder names.

An example package cache could look like this:

  • /.packages

  • /.packages/vegetation_pack

  • /.packages/vegetation_pack/20230505

  • /.packages/vegetation_pack/20230512

  • /.packages/vegetation_pack/20230519

  • /.packages/rocks_pack

  • /.packages/rocks_pack/v1.0.0

  • /.packages/rocks_pack/v1.1.0

To create and manage those packages, we introduce the commands new, create, install, uninstall and list-repo.

We allow both named and unnamed packages. Unnamed packages are top-level directories that merely consume packages produced elsewhere and have no package description file of their own. Named packages have a file .<package name>.wrapp. You can create a named package with the new command, or create one from an unnamed package during the create operation (which leaves the unnamed source package unnamed - but you can run new on a directory that already contains files!).

New

The new command does not operate on any files or packages; rather, it is a shortcut to create a suitable .<package>.wrapp file for use by subsequent install commands.

For instance, when creating a new scenario and wanting to capture the asset packages used, it is useful to have a package file (any name is fine) that will record the dependencies installed.

As an example, just run

wrapp new san_diego_scenario 1.0.0 omniverse://localhost/scenarios/san_diego

This will create a single file .san_diego_scenario.wrapp in the given location.

You can use the cat utility command to display the contents of the .wrapp file.

wrapp cat omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp

and it should look similar to this:

{
  "format_version": "1",
  "name": "san_diego_scenario",
  "version": "1.0.0",
  "catalog": null,
  "remote": null,
  "source_url": "omniverse://localhost/scenarios/san_diego",
  "dependencies": null
}

Create

The create command creates a versioned package in a WRAPP repository. It takes a snapshot of the content within the supplied directory or catalog and copies it to the version in your repository, so anybody installing the package is certain to get the correct data. By default, the package is created in the repository in the root directory of the same Nucleus server.

[Diagram: the create command copying source content into a package version in the repository]

To create a versioned package for reuse from our previous example, run:

wrapp create --package omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp

When you want to later create a new version of this package, just additionally specify the new version:

wrapp create --package omniverse://localhost/scenarios/san_diego/.san_diego_scenario.wrapp --new-version 2.0.0

Alternatively, if you have not run new and there is no .wrapp file in the package directory, you can just specify the name and version directly. This will create a .wrapp file only in the package cache, not in the source of the package:

wrapp create vegetation_pack 20230505 omniverse://localhost/lib/props/vegetation

This will create a copy of the vegetation library in the default package cache at omniverse://localhost/.packages/vegetation_pack/20230505.

create --repo

You can use the --repo option to specify a different repository location; this can be a different directory on your Nucleus server, or a different Nucleus server entirely. Note that when using a different server, wrapp will first download the data from the original server and then upload it to the target server. For example, to create the package on a different Nucleus server that is used for staging tests, we could run:

wrapp create vegetation_pack 20230505 omniverse://localhost/lib/props/vegetation --repo omniverse://staging.nvidia.com

This will create a copy of the vegetation library in omniverse://staging.nvidia.com/.packages/vegetation_pack/20230505.

Additionally, this will create a wrapp file recording the package name, the version, and the source from which it was created. The name will be .{package_name}.wrapp. Running the new command to prepare a .wrapp file is optional.

create --catalog

Alternatively, packages can be created from previously generated catalogs as well. For this, specify the filename of the catalog file instead of a source URL and add the --catalog option:

wrapp create vegetation_pack 20230505 --catalog vegetation.json --repo omniverse://staging.nvidia.com

List-repo

You can also list the packages available in a repository using the list-repo command. Running

wrapp list-repo omniverse://localhost

would give you the list of known packages and the versions present in the root repository of localhost. For example, the output could be

> wrapp list-repo omniverse://localhost
vegetation_pack: 20230505, 20230401

This indicates that the package vegetation_pack is available in the repository, in two different versions.

Install

The install command is used to make a versioned package available without having the version explicitly in the paths, making it possible to update to a new package version without updating references to the files in the package.

You can install packages on the same server, on different servers, or even on your local file system. You can even make multiple versions of a package available at the same time if needed. The idea is to not reference the package archive directly from within the USD files and materials, but rather to create a lightweight copy as a subfolder of the scenario or stage, with no version in that subfolder's path. The version dependency is then maintained with WRAPP.

[Diagram: installing a package version from the repository into a scenario subfolder]

This can most easily be achieved via the install command. Assume the author of a SanDiego scenario stored at omniverse://localhost/scenarios/SanDiego wants to use the vegetation asset pack in a specific version. This can be done with the following command line:

wrapp install vegetation_pack 20230505 omniverse://localhost/scenarios/SanDiego/vegetation

This will look for the package version in the server's .packages directory and make a hard-linked copy in the specified subdirectory vegetation/, from where the assets can be imported and used in the scenario scene.

The install command can also be used to update or downgrade a package at the same location to a different version. For that, just specify a different version number. The command validates that the installed package is unmodified before installing; otherwise it fails with conflicts (to override, delete the package at the install location and run install again, or use the create-patch/apply-patch mechanism described in the Advanced section).

To update the previously installed vegetation_pack to a newer version, just run

wrapp install vegetation_pack 20231015 omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation

To downgrade is just as simple:

wrapp install vegetation_pack 20230505 omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation

install --package

If you use more than one package, it can quickly get complicated to remember which package was installed from which repo. To help with this, wrapp introduces the concept of package files with dependencies.

To create/update a dependency file, specify an additional parameter to the install command like this:

wrapp install vegetation_pack 20231015 omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation --package omniverse://staging.nvidia.com/scenarios/SanDiego/.sandiego.wrapp

This will create a file .sandiego.wrapp at the specified location.

install --patch

If any of the files the install command needs to modify have been manually changed in the installation folder, the installation will fail with an appropriate error message, indicating that the file in the installation folder cannot be updated to match the file in the package folder. This is called a conflict. The following examples constitute conflicts:

  • The same file has been changed in the installation folder and the package but with different content.

  • A new file has been added to both the installation folder and the package but with different content.

  • A file has been deleted from the package, but modified in the installation folder.

This conflict mechanism protects the user from losing any data or modifications to the installation folder. To update the installation folder in such a situation, the create-patch/apply-patch mechanism can be used.

In order to record the conflicts into a patch file, the failed installation can be rerun with an additional parameter specifying the name of the patch file to create. This will apply all non-conflicting changes and record all conflicts in the patch file:

wrapp install vegetation_pack 20230925 omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation --patch install_conflicts.patch

The install_conflicts.patch file is a json file with the operations that would resolve/override the conflicts. Inspect it, edit or remove any operations you do not want, and apply it with

wrapp apply-patch install_conflicts.patch

install --tags

If you have created packages with tags included and you are installing into a Nucleus server that has the tagging service running, you can supply the --tags argument to have the tags reapplied to the installed data.

wrapp install vegetation_pack 20230505 omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation --tags

Uninstall

Any package that has been installed can be uninstalled again. There are two modes of uninstallation: via the directory in which the package has been installed, or via the dependency file which was used to record the install operation. In the latter case, uninstall will also remove the dependency information recorded in that file.

Uninstall via directory:

wrapp uninstall vegetation_pack omniverse://staging.nvidia.com/scenarios/SanDiego/vegetation

uninstall --package

You can also uninstall directly from the package file, no need to specify the installation directory:

wrapp uninstall vegetation_pack --package omniverse://staging.nvidia.com/scenarios/SanDiego/.sandiego.wrapp

Package commands - Advanced

Catalog

The catalog command can be used to create a list of files and folders in a specified subtree and store the result together with explicit version information in a catalog (aka manifest) file.

To catalog the content of a specific subtree on your localhost Nucleus with the assets being at the path NVIDIA/Assets/Skies/Cloudy/, just run:

wrapp catalog omniverse://localhost/NVIDIA/Assets/Skies/Cloudy/ cloudy_skies_catalog.json

Of course, replace localhost with the server name if the data is somewhere else.

The json file produced has archived the files and their versions at the very moment the command was run. Because the server content is live, running the command again might produce a different catalog if files have been added, deleted, or updated in the meantime.

To determine whether the cataloged version is still the same, we can use the diff command to compare two catalogs made at different points in time, or even of different copy locations of the same asset.
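As a sketch, reusing the URL from the example above, the full check could look like this:

```shell
# Catalog the same subtree at two points in time, then compare the catalogs.
wrapp catalog omniverse://localhost/NVIDIA/Assets/Skies/Cloudy/ cloudy_skies_20230505.json

# ... time passes, files may be added, deleted, or updated on the server ...

wrapp catalog omniverse://localhost/NVIDIA/Assets/Skies/Cloudy/ cloudy_skies_20230512.json

# Exit code 0 means identical content, 1 means something changed;
# --show additionally lists the differing files:
wrapp diff cloudy_skies_20230505.json cloudy_skies_20230512.json --show
```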

catalog --local-hash

The --local-hash option can be used when creating packages from content that is stored on a mount or the local file system, or if the data is not checkpointed. It calculates the hashes on the fly, but the data needs to be downloaded to your local machine for this to happen.

wrapp catalog omniverse://localhost/NVIDIA/Assets/Skies/Cloudy/ cloudy_skies_catalog.json --local-hash

catalog --ignore-file

The command supports ignore rules that are by default read from a file called .wrappignore in the current working directory. A different ignore file can be specified with the --ignore-file parameter, e.g. --ignore-file myignorefile.txt.

For example, to exclude all thumbnail directories and all .txt files from the cataloging operation and from the package, create a file called .wrappignore in your current directory containing the lines

.thumbs
*.txt

catalog --tags

If tags need to be cataloged, copied, and diffed as well, specify the --tags parameter. This will do a second pass using the Omniverse tagging service and will archive the current state of tags, their namespaces and values in the catalog file:

wrapp catalog omniverse://example.nvidia.com/lib/props/vegetation vegetation_tagged.json --tags

catalog --file-list

Should your asset be structured differently from a simple folder tree that is traversed recursively by the catalog operation, you can create and specify a file list in the form of a tab-separated URL list, split into the base and the relative path.

As an example, this can be used to create a catalog of an asset structured differently:

omniverse://localhost/NVIDIA/Assets/Skies/Clear/ <TAB> evening_road_01_4k.hdr
omniverse://localhost/NVIDIA/Assets/Skies/Dynamic/ <TAB> Cirrus.usd

If this is stored in a file called input_files.tsv (with a proper ASCII tab character instead of the placeholder), you can create the catalog of this asset with the --file-list parameter like this:

wrapp catalog input_files.tsv evening_road.json --local-hash --file-list

Both files will now be in the root directory of the package to be created, as only the relative part of the path is kept.

Count

This is a helpful command to understand how many files exist in a catalog that will be used to create a package, as well as the on-disk size and duplication stats.

wrapp count vegetation_pack.json

Diff

The diff command compares two catalogs, and can be used to find out what has changed or what are the differences between two different copies of the same subtree.

Assuming we have two catalogs of the same package from the same location at two different dates, we can just run

wrapp diff vegetation_catalog_20230505.json vegetation_catalog_20230512.json --show

This will show the files that have been added or changed in the vegetation_catalog_20230512 package.

diff --show

The --show option asks diff not only to report whether there is a difference (the exit code is 1 if a diff is detected, 0 otherwise), but also to print the list of items that are only in catalog 1 but not in 2, those only in 2 but not in 1, and the list of files whose content differs.
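Because the exit code encodes the result, diff can gate scripted workflows; a sketch with the catalog names from the example above:

```shell
# wrapp diff exits 0 when the catalogs match and 1 when they differ,
# so the exit code can drive shell control flow directly:
if wrapp diff vegetation_catalog_20230505.json vegetation_catalog_20230512.json; then
    echo "catalogs match - nothing to update"
else
    echo "catalogs differ - rerun with --show to list the changes"
fi
```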

Create-patch

The create-patch command uses a three-way comparison to produce a patch file that will merge one file tree into another, given their common ancestor. For example, say we have a file tree at time point 1 and have created a catalog file for it called catalog_week1.json. We copy this state to a new “target” location for use by downstream users. Work continues in the original location, and at time point 2 we create a new catalog called catalog_week2.json.

If we now want to update the copy of the file tree and our “target” location, and want to know if it is safe to overwrite the tree or if there are local modifications we want to be alerted about, we use the following steps:

  1. First, catalog the “target” location, let’s call this current_target.json.

  2. Then, run the following command to produce the patch or delta file which contains the operations needed for the update:

    wrapp create-patch current_target.json catalog_week2.json catalog_week1.json --patch target_patch.json

  3. When the command reports no differences (an empty patch), no local changes have been made and there are no conflicts.
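The three steps above can be sketched as a script (the target URL is hypothetical):

```shell
# 1. Catalog the current state of the "target" location (hypothetical URL).
wrapp catalog omniverse://localhost/projects/target_location current_target.json

# 2. Three-way compare: current target state vs. the new source state,
#    given their common ancestor catalog.
wrapp create-patch current_target.json catalog_week2.json catalog_week1.json --patch target_patch.json

# 3. Inspect target_patch.json; if it is empty, the target is unmodified
#    and the update can be applied without conflicts.
```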

In case there are local changes to the target, two options are offered:

create-patch --ignore

To keep changes in the “target” location and ignore updating them, specify the --ignore parameter to the create-patch command.

wrapp create-patch current_target.json catalog_week2.json catalog_week1.json --patch target_patch.json --ignore

create-patch --force

To remove the changed files in the “target” location, specify the --force parameter to the create-patch command; this will produce a larger patch file containing the necessary patch commands.

wrapp create-patch current_target.json catalog_week2.json catalog_week1.json --patch target_patch.json --force

Apply-patch

Then run the following command to apply the changes recorded in target_patch.json to the target:

wrapp apply-patch target_patch.json  

After this command, the file tree at the “target” location matches the file tree at the source from catalog_week2.json.

Mirror

When working with multiple servers, it might make sense to transfer created packages (or rather specific versions of these) into the .packages folder on another server so install operations on that server are fast and don’t need to specify the source server as a repository.

This is what the mirror operation is built for - it will copy a package version from one server’s .packages directory into another server’s .packages directory.

The simple format of the command is

wrapp mirror vegetation_pack 20230523 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com

mirror --resume

An aborted transfer can be resumed. This is implemented by cataloging the destination directory first and then calculating and applying a delta patch. Activate this behavior with the --resume parameter. If the destination directory does not exist, this parameter does nothing and is ignored:

wrapp mirror vegetation_pack 20230523 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com --resume

mirror --template-version

To accelerate the upload of subsequent versions, we can force a differential upload against an arbitrary version that has already been mirrored; just specify the template version as an additional parameter:

wrapp mirror vegetation_pack 20230623 --source-repo omniverse://dev.nvidia.com --destination-repo omniverse://staging.nvidia.com --template-version 20230523

This will first copy, on the target server, the version specified as the template version into the target folder. Then it will calculate a differential update and only upload and delete the files that changed. This can be a big time saver when many files stayed the same between versions, but it will slow things down if the difference is actually large, because of the additional copy on the destination server and the need to catalog the result of that copy in the destination directory. (An optimization is possible here: rewriting the source catalog would make the subsequent catalog step unnecessary.)

Extract-catalog

For use with the diff, create-patch, and apply-patch commands, we have the ability to extract a catalog out of a package version:

wrapp extract-catalog vegetation_pack 20230623 D:/vegetation_pack_20230623.json --repo omniverse://dev.nvidia.com

Export

Instead of directly copying a package from server to server using the mirror command, you can also have wrapp create a tar file with all contents of a package for a subsequent import operation.

To export, just run

wrapp export vegetation_pack 20230623 --repo omniverse://dev.nvidia.com  

this will download everything to your computer and produce an uncompressed tar file called vegetation_pack.20230623.tar.

export --output

You can specify an alternative output file name or path with the --output option.

wrapp export vegetation_pack 20230623 --repo omniverse://dev.nvidia.com --output D:/vegetation_pack.tar

export --catalog

You can also specify a catalog to export using export --catalog, e.g.

wrapp export vegetation_pack 20230505 --catalog vegetation.json

This allows creating tar files and packages from arbitrary sources, e.g. data hosted on S3 or Azure.
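For instance, a package could be assembled directly from data hosted on S3 (the bucket name here is hypothetical): catalog the S3 prefix with --local-hash, then export from that catalog.

```shell
# Hypothetical S3 bucket; --local-hash computes hashes while downloading,
# since the S3 data carries no checkpoint information:
wrapp catalog s3://example-bucket/lib/props/vegetation vegetation.json --local-hash

# Build a tar package from the catalog just created:
wrapp export vegetation_pack 20230505 --catalog vegetation.json --output vegetation_pack.tar
```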

export --dedup

If you plan on importing the data later using the wrapp import command, consider using the --dedup switch to avoid downloading and storing the same content several times in the tar file.

wrapp export vegetation_pack 20230623 --repo omniverse://dev.nvidia.com --output D:/vegetation_pack.tar --dedup

Import

As you might have guessed, an exported package can also be imported again. To do that, run

wrapp import vegetation_pack.20230623.tar --repo omniverse://staging.nvidia.com

to import the package into the .packages folder on the specified receiving repository.
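Together, export and import provide a simple transfer path between servers that cannot reach each other directly (server names taken from the earlier examples):

```shell
# On a machine that can reach the source server; --dedup avoids storing
# duplicate content in the tar file:
wrapp export vegetation_pack 20230623 --repo omniverse://dev.nvidia.com --dedup

# Move vegetation_pack.20230623.tar to a machine that can reach the target,
# then import it into that server's .packages repository:
wrapp import vegetation_pack.20230623.tar --repo omniverse://staging.nvidia.com
```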

Utility commands - Files & Folders and their metadata

Cat

For viewing the content of a single text file, you can issue the cat command and wrapp will download the content and print it to stdout:

wrapp cat omniverse://localhost/NVIDIA/Assets/Isaac/2022.1/Isaac/Materials/Isaac/nv_green.mdl

Get

Sometimes, it can be handy to have a quick way of retrieving a single file or folder with a command line tool. This is what the get command was made for. To retrieve a single file onto your local disk, just do

wrapp get omniverse://localhost/NVIDIA/Assets/Isaac/2022.1/Isaac/Materials/Isaac/nv_green.mdl

and the tool will download the file.

Freeze

The freeze command is used to freeze or archive a specific version into a new location. This is used to make sure a specific version can be reproducibly addressed at that location, e.g. to run a CI job on a specific version or to create a reproducible version for QA testing and subsequent release.

The freeze command has two modes.

The first mode takes a source subtree URL and creates a copy of the head version of the files at the source position. If both source and destination are on the same Nucleus server, the operation is efficient: no data has to be transferred, and the files and folders at the new destination are effectively hard links to the same content, causing no data duplication. Note that the history is not copied and the checkpoint numbers will not be the same as in the source.

Here is a command to freeze the example vegetation package at the current head version into a new subtree on the same server:

wrapp freeze omniverse://example.nvidia.com/lib/props/vegetation omniverse://example.nvidia.com/archive/props/vegetation_20230505

freeze --catalog

The second mode of the command takes a catalog file as input and again a destination path as the second parameter, but needs the flag --catalog.

wrapp freeze vegetation_catalog_20230505.json omniverse://example.nvidia.com/archive/props/vegetation_20230505 --catalog 

Note that while this allows you to defer the copy to a later point and only catalog the files as a first step, there is no guarantee that the freeze operation will still be able to find all files listed in the catalog - they might have been moved away or obliterated in the meantime. So while creating the catalog first and freezing later is an optimization, be aware that the content referenced by the catalog file is not securely stored until it is frozen.

One useful operation is to specify a local file URL as the destination; this allows you to copy a specific cataloged version out to local disk, e.g. to run a CI job on it:

wrapp freeze vegetation_catalog_20230505.json file:/c:/build_jobs/20230505 --catalog

freeze --ignore-file

Freeze also uses the .wrappignore file like catalog, and supports the --ignore-file parameter as well. So even if files are part of the catalog, they can be ignored at the freeze stage by providing an ignore file.

freeze --tags

To enable respecting tags during the freeze operation and making sure they are copied as well, specify the flag --tags.

Ls

Given a URL, ls will return all the files in that subtree. This supports recursive listing and pattern matching.

wrapp ls omniverse://localhost/Projects --recursive --pattern \*.usd --output-format short

Use the --output-format tsv argument to produce a result suitable for a catalog --file-list.

wrapp ls omniverse://localhost/Projects --recursive --output-format tsv > D:/my_projects.tsv
wrapp catalog D:/my_projects.tsv evening_road.json --local-hash --file-list