Using /benchmarks

In this example we explore in depth the options available when collecting benchmarks from EOSC Performance. This example was created using a Jupyter notebook; click here to access the original notebook file.

Create the environment

To do so, we select an API endpoint and collect an access token; in this example we use oidc-agent to obtain one.

[1]:
eosc_perf_api="https://performance.services.fedcloud.eu/api/v1/"
access_token=$(oidc-token egi-prod)
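Before calling the API it can be worth checking that both variables are actually set; oidc-token prints nothing when the account name is unknown, which otherwise leads to confusing 401 errors later. A minimal sketch of such a check, shown with a placeholder token instead of a live oidc-agent call:

```shell
# Placeholder values standing in for the real endpoint and oidc-token output
eosc_perf_api="https://performance.services.fedcloud.eu/api/v1/"
access_token="example-token"

# Abort early if either variable is empty
if [ -z "$eosc_perf_api" ] || [ -z "$access_token" ]; then
  echo "Missing API endpoint or access token" >&2
  exit 1
fi
echo "environment ready"
```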

(Conditional) Register, if not done already

To use our service as a user, we first need to accept the terms of usage and register. Make sure to read the terms and conditions.

[ ]:
curl -X 'POST' \
  "$eosc_perf_api/users:register" \
  -H "Authorization: Bearer $access_token"

Push a benchmark docker image in a public container repository

All benchmarks must rely on a publicly accessible container image in a container repository.

You can find a tutorial on how to push a docker container image to Docker Hub here.

From version 1.2.0, benchmarks also accept images hosted outside Docker Hub.

After uploading your Docker image, you will need the docker_image and docker_tag identifiers for the later POST request.

[3]:
image="deephdc/deep-oc-benchmarks_cnn"
tag="gpu"
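Together, these two values form the familiar image reference you would also use with docker pull, which can be a quick way to double-check the identifiers before submitting them:

```shell
# The same identifiers as in the cell above
image="deephdc/deep-oc-benchmarks_cnn"
tag="gpu"

# The two fields combine into the usual "<image>:<tag>" reference
echo "${image}:${tag}"
```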

Design a JSON Schema to accept or discard results from users

Results must be linked to a benchmark when submitted. You can control the required fields and their data types to ensure users do not upload invalid results. This also makes it easier for users to compare attributes between results, as the required fields will always be present and share the same type.

If you do not want to use JSON Schemas, you can use {}, which accepts any result.

[4]:
schema='{
        "$id": "https://example.com/benchmark.schema.json",
        "$schema": "https://json-schema.org/draft/2019-09/schema",
        "type": "object",
        "properties": {
          "start_datetime": {
            "description": "The benchmark start datetime.",
            "type": "string",
            "format": "date-time"
          },
          "end_datetime": {
            "description": "The benchmark end datetime.",
            "type": "string",
            "format": "date-time"
          },
          "machine": {
            "description": "Execution machine details.",
            "type": "object",
            "properties": {
              "cpus": {
                "description": "Number of CPU.",
                "type": "integer"
              },
              "ram": {
                "description": "Available RAM in MB.",
                "type": "integer"
              }
            },
            "required": [
              "cpus",
              "ram"
            ]
          }
        },
        "required": [
          "start_datetime",
          "end_datetime",
          "machine"
        ]}'

You can learn more about JSON Schemas at json-schema.org.
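Before uploading the schema, you may want to sanity-check that a typical result would pass it. The snippet below is only a lightweight field-presence check with jq, not full JSON Schema validation, and the sample values are invented for illustration:

```shell
# A made-up result that should satisfy the schema defined above
sample_result='{
  "start_datetime": "2022-02-22T10:00:00Z",
  "end_datetime": "2022-02-22T10:30:00Z",
  "machine": {"cpus": 8, "ram": 16000}
}'

# jq -e exits non-zero when the expression is false, so this doubles as a check
echo "$sample_result" | jq -e '
  has("start_datetime") and has("end_datetime")
  and (.machine | has("cpus") and has("ram"))'
```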

Upload your benchmark

To upload the benchmark, you only need to use an authenticated POST request to /benchmarks and attach the following content to the body:

  • docker_image: Name of the image in Docker Hub (or another public repository, see above).

  • docker_tag: Tag of the docker image you want this benchmark to reference.

  • json_schema: Defined JSON Schema to accept community results.

  • description (optional): Short description of the benchmark for the community users.

[5]:
curl -X 'POST' "$eosc_perf_api/benchmarks" \
  -H 'accept: application/json' \
  -H "Authorization: Bearer $access_token" \
  -H 'Content-Type: application/json' \
  -d "{\"docker_image\": \"$image\", \"docker_tag\": \"$tag\", \"json_schema\": $schema, \
       \"description\": \"A free description for the community\"}" | jq
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  2128  100   853  100  1275    268    400  0:00:03  0:00:03 --:--:--   668
{
  "description": "A free description for the community",
  "docker_image": "deephdc/deep-oc-benchmarks_cnn",
  "docker_tag": "gpu",
  "id": "7722c571-234f-4db3-b4f6-7b8fb869cab0",
  "json_schema": {
    "$id": "https://example.com/benchmark.schema.json",
    "$schema": "https://json-schema.org/draft/2019-09/schema",
    "properties": {
      "end_datetime": {
        "description": "The benchmark end datetime.",
        "format": "date-time",
        "type": "string"
      },
      "machine": {
        "description": "Execution machine details.",
        "properties": {
          "cpus": {
            "description": "Number of CPU.",
            "type": "integer"
          },
          "ram": {
            "description": "Available RAM in MB.",
            "type": "integer"
          }
        },
        "required": [
          "cpus",
          "ram"
        ],
        "type": "object"
      },
      "start_datetime": {
        "description": "The benchmark start datetime.",
        "format": "date-time",
        "type": "string"
      }
    },
    "required": [
      "start_datetime",
      "end_datetime",
      "machine"
    ],
    "type": "object"
  },
  "upload_datetime": "2022-02-22T11:25:25.688828"
}
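The id field in the response identifies your benchmark in later requests. If you capture the POST response in a shell variable, jq -r extracts the id without the surrounding quotes; the sketch below uses a shortened placeholder response rather than a live call:

```shell
# Placeholder standing in for: response=$(curl -X 'POST' "$eosc_perf_api/benchmarks" ...)
response='{"id": "7722c571-234f-4db3-b4f6-7b8fb869cab0", "docker_tag": "gpu"}'

# -r prints the raw string, so the id can be reused directly in request URLs
benchmark_id=$(echo "$response" | jq -r '.id')
echo "$benchmark_id"
```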

Download your benchmark

You can check that your benchmark is available by downloading it.

Note that uploaded benchmarks need to be validated by administrators. Therefore, your benchmark will only be available to the community after the review is completed.

[6]:
benchmarks=$(curl -X 'GET' "$eosc_perf_api/benchmarks?docker_image=$image&docker_tag=$tag")
echo $benchmarks | jq '.items[0].id'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   943  100   943    0     0   131k      0 --:--:-- --:--:-- --:--:--  131k
"7722c571-234f-4db3-b4f6-7b8fb869cab0"