Data Object Service Schemas

Welcome to the documentation for the Data Object Service Schemas! These schemas present an easy-to-implement interface for publishing and accessing data in heterogeneous storage environments. The project also includes a demonstration client and server to make creating your own DOS implementation easy.

Schemas for the Data Object Service (DOS) API

The Global Alliance for Genomics and Health is an international coalition formed to enable the sharing of genomic and clinical data. Its collaboration takes place primarily via GitHub and public meetings.

Cloud Workstream

The Data Working Group concentrates on data representation, storage, and analysis, including working with platform development partners and industry leaders to develop standards that will facilitate interoperability. The Cloud Workstream is an informal, multi-vendor working group focused on standards for exchanging Docker-based tools and CWL/WDL workflows, execution of Docker-based tools and workflows on clouds, and abstract access to cloud object stores.

What is DOS?

This proposal for a DOS release is based on the schema work of Brian W. and others from OHSU, along with work by UCSC. It is also informed by a number of existing object storage systems.

The goal of DOS is to create a generic API on top of these and other projects, so workflow systems can access data in the same way regardless of project.

Key features

Data object management

This section of the API focuses on how to read and write data objects to cloud environments and how to join them together as data bundles. A data bundle is simply a flat collection of one or more files. This section of the API enables:

  • create/update/delete a file
  • create/update/delete a data bundle
  • register UUIDs with these entities (and optionally track versions of each)
  • generate signed URLs and/or cloud specific object storage paths and temporary credentials
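As a sketch of what registering an entity looks like, the snippet below builds a minimal data object record as a plain dict. The field names follow the DOS DataObject schema, but treat the exact types and the helper itself as illustrative assumptions; a real implementation would POST such a record to a DOS server.

```python
import datetime
import json
import uuid

def make_data_object(name, size, md5, url):
    """Build a minimal DOS-style DataObject record (illustrative sketch)."""
    return {
        "id": str(uuid.uuid4()),  # register a UUID for the new entity
        "name": name,
        "size": str(size),  # sizes are serialized as strings (int64 convention)
        "created": datetime.datetime.utcnow().isoformat() + "Z",
        "version": "1",  # optional version tracking
        "checksums": [{"checksum": md5, "type": "md5"}],
        "urls": [{"url": url}],
    }

obj = make_data_object(
    "reads.bam", 1024,
    "4bc56ea6971ec7a0be36d8fbcb10c9a7",
    "s3://example-bucket/reads.bam")
print(json.dumps(obj, indent=2))
```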

Data object queries

A key feature of this API, beyond creating/updating/deleting files, is the ability to find data objects across cloud environments and implementations of DOS. This section of the API allows users to query by data bundle or file UUID, which returns information about where those data objects are available. The response will typically be used to find the same file or data bundle located across multiple cloud environments.
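For example, once a query by UUID returns a record replicated across clouds, a workflow can filter the returned URLs by scheme to pick the copy nearest to it. The helper and sample record below are illustrative, not part of the package:

```python
def urls_for_scheme(data_object, scheme):
    """Return the access URLs in a DataObject record that use the given
    scheme (e.g. 's3' or 'gs'), so a caller can pick a nearby replica."""
    prefix = scheme + "://"
    return [u["url"] for u in data_object.get("urls", [])
            if u["url"].startswith(prefix)]

# A query by UUID might return a record available in multiple clouds:
record = {
    "id": "0d09f1e8-example",
    "urls": [
        {"url": "s3://us-east-1-bucket/reads.bam"},
        {"url": "gs://example-bucket/reads.bam"},
    ],
}
print(urls_for_scheme(record, "gs"))  # → ['gs://example-bucket/reads.bam']
```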

Implementations

There are currently a few experimental implementations that use some version of these schemas.

  • DOS Connect observes cloud and local storage systems and broadcasts their changes to a service that presents DOS endpoints.
  • DOS Downloader is a mechanism for downloading Data Objects from DOS URLs.
  • dos-gdc-lambda presents data from the GDC public REST API using the Data Object Service.
  • dos-signpost-lambda presents data from a signpost instance using the Data Object Service.

Quickstart

Installing

Installing is quick and easy. First, it’s always good practice to work in a virtualenv:

$ virtualenv venv
$ source venv/bin/activate

Then, install from PyPI:

$ pip install ga4gh-dos-schemas

Or, to install from source:

$ git clone https://github.com/ga4gh/data-object-service-schemas.git
$ cd data-object-service-schemas
$ python setup.py install

Running the client and server

There’s a handy command line hook for the server:

$ ga4gh_dos_server

and for the client:

$ ga4gh_dos_demo

(The client doesn’t do anything yet but will soon.)
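Until the client is functional, you can talk to the server over plain HTTP. The sketch below builds DOS endpoint URLs; the base path `/ga4gh/dos/v1`, the port `8080`, and the resource names are assumptions based on the Swagger schema, so check them against your running server:

```python
def dos_url(host, kind, object_id=None, base="/ga4gh/dos/v1"):
    """Build a DOS endpoint URL for data objects or data bundles
    (base path and resource names are assumptions)."""
    path = {"object": "dataobjects", "bundle": "databundles"}[kind]
    url = "http://{}{}/{}".format(host, base, path)
    if object_id is not None:
        url += "/" + object_id
    return url

# e.g. fetch a single data object from a locally running demo server:
# import requests
# resp = requests.get(dos_url("localhost:8080", "object", "abc123"))
print(dos_url("localhost:8080", "object", "abc123"))
```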

Further reading

  • gdc_notebook.ipynb outlines examples of how to access data with this tool.
  • demo.py demonstrates basic CRUD functionality implemented by this package.

Data Object Service Demonstration Server

DOS Python HTTP Client

Contributor’s Guide

Installing

To install for development, install from source (and be sure to install the development requirements as well):

$ git clone https://github.com/ga4gh/data-object-service-schemas.git
$ cd data-object-service-schemas
$ python setup.py develop
$ pip install -r requirements.txt

Documentation

We use Sphinx for our documentation. You can generate an HTML build like so:

$ cd docs/
$ make html

You’ll find the built documentation in docs/build/.

Tests

To run tests:

$ nosetests python/

The Travis test suite also checks PEP 8 compliance using a selected set of flake8 checks (ignoring line length):

$ flake8 --select=E121,E123,E126,E226,E24,E704,W503,W504 --ignore=E501 python/

Schema architecture

The canonical, authoritative schema is located at openapi/data_object_service.swagger.yaml. All schema changes must be made to the Swagger schema, and all other specifications (e.g. SmartAPI, OpenAPI 3) are derived from it.

Building documents

The schemas are editable as OpenAPI 2 YAML files. To generate OpenAPI 3 descriptions, install swagger2openapi and run the following:

$ swagger2openapi -y openapi/data_object_service.swagger.yaml > openapi/data_object_service.openapi.yaml

Releases

New versions are released when ga4gh.dos.__version__ is incremented, a commit is tagged (either through a release or manually), and the tagged branch builds successfully on Travis. When these conditions are met, Travis automatically uploads the distribution to PyPI.

If ga4gh.dos.__version__ is not incremented in a new release, the build may appear to complete successfully, but the package will not be uploaded to PyPI as the distribution will be interpreted as a duplicate release and thus refused.

The process above is currently managed by david4096. To transfer this responsibility, ownership of the PyPI package must be transferred to a new account, and that account's credentials added to .travis.yml.

Note that this repository will not be compliant with Semantic Versioning until version 1.0; until then, the API should be considered unstable.

Documentation is updated independently of this release cycle.

Code contributions

We welcome code contributions! Feel free to fork the repository and submit a pull request. Please refer to the contribution guide for guidance on how to submit changes.

Data Object Service Schemas is licensed under the Apache 2.0 license. See LICENSE for more info.
