# Organizer setup

These are the instructions for anyone who wants to organize a robotics simulation benchmark. The links in the rest of this setup are relative to the repository containing this file, so to be able to use them you should first create your own repository from this template and open its ORGANIZER.md file to continue reading the instructions.

You will then need to follow these steps (remember that you can open a link in a new tab by middle-clicking it):

## GitHub settings

1. Go to the Settings tab:
   1. Under the General section, tick the "Template repository" box so that competitors can easily make a copy of the simulation files.
2. Set up a GitHub secret so that the automation scripts can fetch your competitors' controllers (a sketch of how such a script might use the secret follows this list):
   1. Create a new Personal Access Token. Give it a name that reminds you what it is for and set its "Expiration" to the end of the tournament. You can always set it to "No expiration" or recreate the token when it expires so that the automated scripts keep working. Tick the "repo" scope box, scroll down to the "Generate token" button and click it. Copy the generated token to your clipboard.
   2. Go to the repo's secrets settings to create a new repository secret. Name it "REPO_TOKEN", paste the Personal Access Token you just created into the "Secret" text area and finally click the "Add secret" button.
3. Add the three custom labels needed by the automation scripts: "registration", "pending" and "accepted".
   1. Go to the Generate new labels action page under the Actions tab and click on "Run workflow" to create the needed labels automatically. The workflow may take a few seconds to complete.
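
As a minimal illustration of why the token needs the "repo" scope, here is a hedged sketch of how an automation script could use the REPO_TOKEN secret to clone a competitor's repository. This is not the template's actual script: the repository name is a placeholder, and the workflow is assumed to expose the secret as an environment variable (e.g. with `env: REPO_TOKEN: ${{ secrets.REPO_TOKEN }}` in the workflow YAML).

```python
import os
import subprocess

# The workflow is assumed to expose the repository secret as an
# environment variable named REPO_TOKEN.
token = os.environ["REPO_TOKEN"]

# Hypothetical competitor repository; the token's "repo" scope is what
# allows cloning private controller repositories.
repository = "some-competitor/their-controller-repo"
subprocess.run(
    ["git", "clone", f"https://{token}@github.com/{repository}.git"],
    check=True,
)
```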

## Webots files

1. Replace/add all the files needed for your Webots simulation at the root of the repository, notably the folders:
   - worlds for your Webots scenario
   - controllers for your robot and supervisor controllers
   - plugins for the HTML robot windows
   - protos if you need extra PROTOs
2. Make sure that, inside the world file, the supervisor node has its "synchronization" field set to TRUE and the Robot node has its "synchronization" field set to FALSE.
   - Note that on webots.cloud, the listing title of the benchmark and its hover description are defined in the Webots world file: more specifically, the WorldInfo node has a "title" and an "info" field which are parsed when the world file is submitted to webots.cloud (illustrated after this list).
3. For the automated script to recover the competitors' scores correctly, the supervisor needs to print the final performance of the robot controller to stdout in the format "performance:SCORE", where only the SCORE part changes and must be a floating-point number. The score's unit depends on the metric used for the benchmark, which is defined in webots.yml and edited in the next step; a supervisor sketch follows the metrics table below.
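
For illustration, the two WorldInfo fields mentioned above could look like this in your .wbt world file (the title and description texts are placeholders):

```
WorldInfo {
  title "My Benchmark"
  info [
    "A short description displayed when hovering the benchmark listing on webots.cloud."
  ]
}
```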

### Supported metrics

| name | description | score value |
|------|-------------|-------------|
| percent | ranks users based on how close they are to a given objective | a value between 0 and 1 |
| time-speed | ranks users based on how quickly they complete the objective | a time in seconds |
| time-duration | ranks users based on how long they manage to perform a task (e.g., to maintain an inverted pendulum upright) | a time in seconds |
| distance | ranks users based on how far they manage to move something (including themselves) | a distance in meters |
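
Here is a minimal supervisor sketch showing the "performance:SCORE" convention for a hypothetical time-duration benchmark (the termination test and the 60-second cut-off are placeholders, not part of the template):

```python
from controller import Supervisor  # Webots Python API
import sys

supervisor = Supervisor()
timestep = int(supervisor.getBasicTimeStep())


def task_failed() -> bool:
    """Placeholder for the benchmark-specific termination test."""
    return supervisor.getTime() > 60.0  # hypothetical 60 s cut-off


# Run the simulation until the task ends, tracking the elapsed time.
elapsed = 0.0
while supervisor.step(timestep) != -1:
    elapsed = supervisor.getTime()
    if task_failed():
        break

# Print the final score to stdout in the expected format and flush, so the
# automated script reliably receives it before the process exits.
print(f"performance:{elapsed}")
sys.stdout.flush()
```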

## Benchmark specific files

1. Update the parameters inside webots.yml (a hypothetical example follows this list), and don't forget to commit your changes to save them:
   - file: the relative path to your world file.
   - maximum-duration: the maximum duration of an evaluation, in seconds. Choose a value that is not so large that broken controllers are evaluated for a long time, yet leaves enough time to finish the task.
   - metric: the metric used for the benchmark. Use one of the values defined in the metrics table above.
   - dockerCompose: a special path used by the integrated IDE and the GitHub actions to locate the default robot controller. Change "edit_me" to the name of your main robot controller.
2. When a controller is evaluated, Webots and the controller run inside Docker containers. There are two Dockerfiles at the root of the repository: Dockerfile for the Webots container, and controller_Dockerfile for the controller container holding the competitor's setup. The default Dockerfile launches, in one container, a standard version of Webots with the world file defined in webots.yml. The default controller_Dockerfile launches, in another container, the Python robot controller specified in webots.yml, which communicates with the Webots process running in the first container.
   - The default webots.cloud Docker image already includes the tools needed to compile and run C, C++ and Python controllers. However, if your simulation or supervisor controller requires a special environment (for example, specific libraries), you can configure the main Dockerfile as needed. Similarly, if competitors have special dependencies for their robot controllers (like ROS 2 or specific Python libraries), they can configure their controller_Dockerfile accordingly.
3. Replace the three files of the preview folder with an example animation of your benchmark recorded from Webots. Keep the same file names: animation.json, scene.x3d and thumbnail.jpg.
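
A hypothetical webots.yml for a time-duration benchmark could look like the following. The world and controller names are placeholders, and the dockerCompose prefix shown here is an assumption: only replace the "edit_me" part of the template's default value, as instructed above.

```yaml
file: worlds/inverted_pendulum.wbt  # relative path to your world file (placeholder name)
maximum-duration: 120               # evaluation cut-off, in seconds (assumed value)
metric: time-duration               # one of the supported metrics above
dockerCompose: theia:webots-project/controllers/my_controller  # "edit_me" replaced by the controller name
```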

## README update

Some sections of the README file are used to generate the webots.cloud benchmark page: the title, the description and an information table. Make sure to edit them while keeping them inside their respective `<span>` tags.

Update the README file:

1. Change the title and the description sections to describe your new scenario.
2. Update the fields of the information section (a hypothetical example follows this list):
   - Difficulty: an idea of the benchmark's complexity (for example: Middle School, High School, Bachelor, Master, PhD...)
   - Robot: the name of the robot used in the benchmark
   - Language: the programming language of the example controller
   - Commitment: an idea of the time required to complete the benchmark (a few minutes, a couple of hours, a couple of days...)
3. Replace the two occurrences of "ORGANIZER_NAME" in the "How to participate" section with your GitHub username, and the occurrence of "ORGANIZER_REPOSITORY" with your repository name.
4. Remove the "Organizer setup" section at the top of the file.
5. Don't forget to commit your changes to save them.
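
For illustration only, a filled-in information section could read as follows (all values are placeholders, and the exact layout should follow the template's README; keep each value inside its `<span>` tag as noted above):

| Difficulty | Robot | Language | Commitment |
|------------|-------|----------|------------|
| High School | e-puck | Python | a couple of hours |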

## Webots.cloud submission

You can now submit your benchmark to webots.cloud to share it with other people. On the website, in the "Benchmark" tab, click on "Add a new benchmark" and enter the URL of your .wbt world file located in the worlds folder.

When you have submitted your benchmark to webots.cloud, change the link of the shield badge at the top of the README file so that it points to your own webots.cloud page. You will then be able to reach the webots.cloud site easily to see your updated changes, and your competitors will have a handy link to the leaderboard. This link is also used in the automated messages to your participants, so make sure it points to the right page.

## Final test

To check that your repository is correctly configured, copy its URL and register it to itself. The registration should complete without any errors; if it does not, check the Actions logs for clues on how to solve the problem.

Finally, once you have completed all the previous steps, you can delete this file, and your benchmark should be good to go!