5 |
--------------------------------------------------------------------------------
/Diagram.rst:
--------------------------------------------------------------------------------
1 | ***************************
2 | Diagram of REST operations
3 | ***************************
4 |
5 |
6 | .. image:: RESTful_HDF5.png
7 | :width: 100 %
8 | :alt: Diagram of REST operations
9 | :align: right
--------------------------------------------------------------------------------
/index.md:
--------------------------------------------------------------------------------
1 | HDF REST API Developer Documentation
2 | ====================================
3 |
4 | This is the developer documentation for the HDF REST API.
5 |
6 | Contents:
7 |
- Introduction
- DomainOps
- GroupOps
- DatasetOps
- DatatypeOps
- AttrOps
- Types
- AclOps
- Reference
8 | Indices and tables
9 | ==================
10 |
11 | - genindex
12 | - modindex
13 | - search
14 |
15 |
--------------------------------------------------------------------------------
/Reference.rst:
--------------------------------------------------------------------------------
1 | ###################
2 | Reference
3 | ###################
4 |
5 | .. toctree::
6 | :maxdepth: 2
7 |
8 | Authorization
9 | CommonRequestHeaders
10 | CommonResponseHeaders
11 | CommonErrorResponses
12 | Diagram
13 | Hypermedia
14 | Resources
15 | UsingIteration
16 |
17 |
18 |
--------------------------------------------------------------------------------
/.github/workflows/validate.yml:
--------------------------------------------------------------------------------
1 | # This workflow installs Node.js, installs the IBM OpenAPI validator, and lints openapi.yaml.
2 | # For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions
3 |
4 | name: Node.js CI
5 |
6 | on: [push, workflow_dispatch]
7 |
8 | jobs:
9 | validateOpenApiSpec:
10 |
11 | runs-on: ubuntu-latest
12 |
13 | steps:
14 | - uses: actions/checkout@v2
15 | - name: Use Node.js
16 | uses: actions/setup-node@v2
17 | with:
18 | node-version: 16
19 | - run: npm install -g ibm-openapi-validator
20 | - run: lint-openapi openapi.yaml
21 |
--------------------------------------------------------------------------------
/CommonResponseHeaders.md:
--------------------------------------------------------------------------------
1 | Common Response Headers
2 | =======================
3 |
4 | The following describes the status line and some of the common response headers returned by the HDF REST API.
5 |
6 | > - Status Line: the first line of the response will always be "`HTTP/1.1`" followed by
7 | > a status code (e.g. 200) and a reason message (e.g. "`OK`"). For errors, an additional error message may be included on this line.
8 | >
9 | > - Content-Length: the response size in bytes.
10 | > - Etag: a hash code that indicates the state of the requested resource. If the client
11 | > sees the same Etag value for the same request, it can assume the resource has not changed since the last request.
12 | >
13 | > - Content-Type: the MIME type of the response.
14 |
15 |
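As an illustrative sketch only (the endpoint URL and domain name below are placeholders, not a real service), these fields can be inspected from a Python client as follows:

```python
import requests

# Placeholder endpoint and domain -- substitute a real HDF REST service.
rsp = requests.get("http://127.0.0.1:5000/",
                   headers={"Host": "tall.data.hdfgroup.org",
                            "Accept": "application/json"})

print(rsp.status_code, rsp.reason)        # status line, e.g. "200 OK"
print(rsp.headers.get("Content-Length"))  # response size in bytes
print(rsp.headers.get("Etag"))            # hash reflecting the resource's state
print(rsp.headers.get("Content-Type"))    # e.g. "application/json"

# If a later request for the same resource returns the same Etag value,
# the client can assume the resource has not changed.
```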
--------------------------------------------------------------------------------
/index.rst:
--------------------------------------------------------------------------------
1 | .. HDF REST API documentation master file, created by
2 | sphinx-quickstart on Mon Jul 9 10:29:28 2018.
3 | You can adapt this file completely to your liking, but it should at least
4 | contain the root `toctree` directive.
5 |
6 | HDF REST API Developer Documentation
7 | ====================================
8 |
9 | This is the developer documentation for the HDF REST API.
10 |
11 | Contents:
12 |
13 | .. toctree::
14 | :maxdepth: 2
15 |
16 | Introduction/index
17 | DomainOps/index
18 | GroupOps/index
19 | DatasetOps/index
20 | DatatypeOps/index
21 | AttrOps/index
22 | Types/index
23 | AclOps/index
24 | Reference
25 |
26 |
27 |
28 | Indices and tables
29 | ==================
30 |
31 | * :ref:`genindex`
32 | * :ref:`modindex`
33 | * :ref:`search`
34 |
--------------------------------------------------------------------------------
/CommonResponseHeaders.rst:
--------------------------------------------------------------------------------
1 | ***************************
2 | Common Response Headers
3 | ***************************
4 |
5 | The following describes the status line and some of the common response headers returned by the HDF REST API.
6 |
7 | * Status Line: the first line of the response will always be "``HTTP/1.1``" followed by
8 | a status code (e.g. 200) and a reason message (e.g. "``OK``"). For errors,
9 | an additional error message may be included on this line.
10 |
11 | * Content-Length: the response size in bytes.
12 |
13 | * Etag: a hash code that indicates the state of the requested resource. If the client
14 | sees the same Etag value for the same request, it can assume the resource has not
15 | changed since the last request.
16 |
17 | * Content-Type: the MIME type of the response.
18 |
19 |
--------------------------------------------------------------------------------
/Introduction/index.rst:
--------------------------------------------------------------------------------
1 | ###################
2 | Introduction
3 | ###################
4 |
5 | The HDF REST API is an interface to HDF5 data stored on network-based architectures.
6 | It supports CRUD (create, read, update, and delete) operations on the full spectrum of
7 | HDF5 objects, including groups, links, datasets, attributes, and committed datatypes.
8 | See https://support.hdfgroup.org/pubs/papers/RESTful_HDF5.pdf for more information on the
9 | RESTful HDF5 interface.
10 |
11 | Because the API is REST-based, clients can be implemented in a variety of languages,
12 | such as JavaScript, Python, and C. Some services that implement the HDF REST API are:
13 |
14 | HDF Kita: https://www.hdfgroup.org/hdf-kita
15 |
16 | H5serv: https://github.com/HDFGroup/h5serv
17 |
18 | HDF5 REST VOL: https://bitbucket.hdfgroup.org/users/jhenderson/repos/rest-vol/browse
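
As a minimal sketch of what a client request looks like (the endpoint URL and domain
name below are placeholders rather than a real service), a Python client could retrieve
a JSON description of a domain as follows:

.. code-block:: python

    import requests

    endpoint = "http://127.0.0.1:5000"            # placeholder service URL
    headers = {"Host": "tall.data.hdfgroup.org",  # placeholder domain name
               "Accept": "application/json"}

    # GET on the service root returns a JSON description of the requested domain.
    rsp = requests.get(endpoint + "/", headers=headers)
    rsp.raise_for_status()
    print(rsp.json())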
--------------------------------------------------------------------------------
/make.bat:
--------------------------------------------------------------------------------
1 | @ECHO OFF
2 |
3 | pushd %~dp0
4 |
5 | REM Command file for Sphinx documentation
6 |
7 | if "%SPHINXBUILD%" == "" (
8 | set SPHINXBUILD=sphinx-build
9 | )
10 | set SOURCEDIR=.
11 | set BUILDDIR=_build
12 | set SPHINXPROJ=HDFRESTAPI
13 |
14 | if "%1" == "" goto help
15 |
16 | %SPHINXBUILD% >NUL 2>NUL
17 | if errorlevel 9009 (
18 | echo.
19 | echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
20 | echo.installed, then set the SPHINXBUILD environment variable to point
21 | echo.to the full path of the 'sphinx-build' executable. Alternatively you
22 | echo.may add the Sphinx directory to PATH.
23 | echo.
24 | echo.If you don't have Sphinx installed, grab it from
25 | echo.http://sphinx-doc.org/
26 | exit /b 1
27 | )
28 |
29 | %SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
30 | goto end
31 |
32 | :help
33 | %SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
34 |
35 | :end
36 | popd
37 |
--------------------------------------------------------------------------------
/CommonRequestHeaders.md:
--------------------------------------------------------------------------------
1 | Common Request Headers
2 | ======================
3 |
4 | The following describes the request line and the common HTTP request headers used in the HDF REST API:
5 |
6 | > - Request line: the first line of the request; it consists of an HTTP verb (GET, PUT, DELETE, or POST) followed by the path to the resource (e.g. /group/<uuid>). Some operations take one or more query parameters (see the relevant documentation).
7 | > - Accept: specifies the media type that is acceptable for the response. Valid values are "application/json" and "*/*". In addition, the GET/PUT/POST Value operations (see DatasetOps/GET\_Value, DatasetOps/PUT\_Value, and DatasetOps/POST\_Value) support the value "application/octet-stream".
8 | > - Authorization: a string that provides the requester's credentials for the request. See Authorization.
9 | > - Host: the domain (i.e. a related collection of groups, datasets, and attributes) to which the request applies.
10 | >
11 | > Note: the host header can also be provided as a query parameter; see the example below.
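
For example (illustrative only: the endpoint URL, domain name, and credentials are placeholders, the query parameter is assumed to be named `host`, and the `auth` tuple is what produces the Authorization header):

```python
import requests

endpoint = "http://127.0.0.1:5000"   # placeholder HDF REST service URL
domain = "tall.data.hdfgroup.org"    # placeholder domain name

# Host supplied as a request header.
rsp = requests.get(endpoint + "/",
                   headers={"Host": domain, "Accept": "application/json"},
                   auth=("reader", "secret"))    # sets the Authorization header

# Equivalent request with the host passed as a query parameter
# (parameter name assumed to be "host").
rsp = requests.get(endpoint + "/",
                   headers={"Accept": "application/json"},
                   params={"host": domain},
                   auth=("reader", "secret"))
```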

The resources defined by the HDF REST API are summarized below:

| Resource | GET | PUT | POST | DELETE | Description |
|---|---|---|---|---|---|
| Domain |  |  |  |  | A related collection of HDF objects |
| Group |  |  |  |  | Represents an HDF Group |
| Links |  |  |  |  | Collection of links within a group |
| Link |  |  |  |  | Represents an HDF link |
| Dataset |  |  |  |  | Represents an HDF Dataset |
| Attributes |  |  |  |  | Collection of Attributes |
| Attribute |  |  |  |  | Represents an HDF Attribute |
| Dataspace |  |  |  |  | Shape of a dataset |
| Type |  |  |  |  | Type of a dataset |
| Value |  |  |  |  | Data values of a dataset |
| Datatype |  |  |  |  | Committed datatype |
| Groups |  |  |  |  | Collection of groups within a domain |
| Datasets |  |  |  |  | Collection of datasets within a domain |
| Datatypes |  |  |  |  | Collection of datatypes within a domain |
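
As a usage sketch tied to the table above (the endpoint, domain, and group UUID are placeholders, and the definitive request and response formats are given in the GroupOps documentation), a client might list the Links collection of a group like this:

```python
import requests

endpoint = "http://127.0.0.1:5000"             # placeholder HDF REST service URL
headers = {"Host": "tall.data.hdfgroup.org",   # placeholder domain name
           "Accept": "application/json"}

group_uuid = "<group-uuid>"                    # obtained from an earlier request
rsp = requests.get(f"{endpoint}/groups/{group_uuid}/links", headers=headers)
rsp.raise_for_status()
print(rsp.json())                              # JSON description of the group's links
```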