Architecture
USD Search API is a collection of microservices: Search, Embedding, Indexing, and Rendering. These microservices index data on a storage backend (e.g., Amazon S3 or an Omniverse Nucleus server) and allow you to query the indexed data using metadata, natural language, and/or image queries.
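As a sketch of what such a query can look like, the snippet below posts a natural-language description to a USD Search deployment over HTTP. The endpoint URL, request fields, and response shape are illustrative assumptions, not the documented contract.

```python
# Minimal sketch of a natural-language query against a USD Search API
# deployment. The endpoint URL and payload fields are assumptions,
# not the documented contract.
import requests

SEARCH_URL = "http://localhost:8080/search"  # hypothetical deployment URL


def search_assets(description: str, limit: int = 10):
    """Return search hits for a natural-language asset description."""
    response = requests.post(
        SEARCH_URL,
        json={"description": description, "limit": limit},  # assumed fields
        timeout=30,
    )
    response.raise_for_status()
    return response.json()


for hit in search_assets("rusty metal barrel"):
    print(hit)
```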

Figure: USD Search API architecture diagram that schematically shows the interaction between the indexing, embedding, rendering, and API services.
Here is a breakdown of the process:
OpenUSD assets are first rendered by the Rendering service, which produces eight previews of each USD asset from predetermined orientations chosen to capture the asset's appearance from different sides. In addition, if any cameras are pre-set inside the USD asset file, images from those cameras are collected as well.
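The sketch below illustrates how such a set of viewpoints might be assembled with the OpenUSD Python API: eight fixed azimuth angles plus any cameras authored in the file. The 45-degree spacing is an assumption for illustration, not the service's actual configuration.

```python
# Illustrative viewpoint selection for preview rendering: eight fixed
# orientations around the asset plus any cameras authored in the file.
# The angle spacing is a hypothetical choice, not the service's config.
from pxr import Usd, UsdGeom


def preview_viewpoints(usd_path: str):
    stage = Usd.Stage.Open(usd_path)

    # Eight evenly spaced azimuth angles around the asset (assumed spacing).
    fixed_orientations = [i * 45.0 for i in range(8)]

    # Cameras already authored in the asset are used as extra viewpoints.
    authored_cameras = [
        prim.GetPath()
        for prim in stage.Traverse()
        if prim.IsA(UsdGeom.Camera)
    ]
    return fixed_orientations, authored_cameras
```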
These previews are then passed to the Embedding service, which extracts NVCLIP embeddings from them and stores the resulting vectors in an OpenSearch index.
During indexing, 2D assets (such as images and textures) are likewise processed by the Embedding service, and the resulting NVCLIP embedding vectors are stored in the same OpenSearch index.
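A minimal sketch of the storage side of this step is shown below, using opensearch-py to write an embedding into a k-NN field. The index name, field names, and vector dimension are assumptions for illustration; the service's actual mapping may differ.

```python
# Sketch of storing an embedding vector as an OpenSearch k-NN field.
# Index name, field names, and the 1024-dim size are assumptions.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])

client.indices.create(
    index="asset-embeddings",  # hypothetical index name
    body={
        "settings": {"index": {"knn": True}},
        "mappings": {
            "properties": {
                "url": {"type": "keyword"},
                "embedding": {"type": "knn_vector", "dimension": 1024},
            }
        },
    },
    ignore=400,  # tolerate "index already exists"
)

embedding = [0.0] * 1024  # placeholder for a real NVCLIP vector
client.index(
    index="asset-embeddings",
    body={"url": "s3://bucket/textures/metal.png", "embedding": embedding},
)
```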
In addition to the NVCLIP embeddings, the Asset Graph Service extracts the graph of dependencies and prims for each OpenUSD asset and stores it in a Neo4j database.
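As a sketch of what that graph storage can look like, the snippet below records a single asset-to-asset dependency edge with the Neo4j Python driver. The (:Asset)-[:REFERENCES]->(:Asset) model is an assumed schema, not the Asset Graph Service's documented one.

```python
# Sketch of recording one asset-to-asset dependency edge in Neo4j.
# The (:Asset)-[:REFERENCES]->(:Asset) schema is an assumption.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))


def record_dependency(tx, source_url: str, target_url: str):
    tx.run(
        "MERGE (a:Asset {url: $src}) "
        "MERGE (b:Asset {url: $dst}) "
        "MERGE (a)-[:REFERENCES]->(b)",
        src=source_url,
        dst=target_url,
    )


with driver.session() as session:
    session.execute_write(
        record_dependency,
        "omniverse://server/scenes/factory.usd",
        "omniverse://server/props/crate.usd",
    )
driver.close()
```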
A compact version of this architecture, backed by a pre-rendered and pre-indexed database, is deployed to the NVIDIA API Catalog. This is illustrated in the diagram below.

Figure: USD Search API architecture diagram for the API Catalog deployment. It does not include the indexing and rendering services; instead, it relies on pre-computed search indexes.
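At query time, a deployment like this only needs to embed the incoming query and search the pre-computed vectors. The sketch below shows what such a lookup might look like as an OpenSearch k-NN query; the index and field names match the hypothetical ones used above, and the placeholder vector stands in for an NVCLIP embedding of the query.

```python
# Sketch of a k-NN lookup over pre-computed embeddings in OpenSearch.
# Index and field names are the same hypothetical ones as above.
from opensearchpy import OpenSearch

client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])


def knn_search(query_vector, k: int = 10):
    return client.search(
        index="asset-embeddings",  # hypothetical index name
        body={
            "size": k,
            "query": {"knn": {"embedding": {"vector": query_vector, "k": k}}},
        },
    )


# In the real service the query vector would come from the Embedding
# service (NVCLIP) at query time; a placeholder is used here.
results = knn_search([0.0] * 1024)
```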