GRAAL.md
The problem we want to solve is detecting regressions in both Micronaut and GraalVM as soon as they are introduced, before a new version is released.
We use Gitlab CI for running the tests.
We could configure the CI pipeline to run the tests for every commit in Micronaut (by configuring a webhook), but we don't have permission to configure a webhook in the GraalVM repository. Also, running the whole test suite takes about 30 minutes and uses a lot of resources. Instead of running all the tests for every commit, there are scheduled jobs that run every work day and check whether there are new commits in either the Micronaut or GraalVM repositories. When there are, the jobs trigger a CI build.
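The daily check can be sketched roughly as follows. This is an illustration, not the actual script: the function name and the way the previously seen commits are stored are assumptions.

```shell
# Hypothetical sketch of the scheduled check: trigger a build only when
# either repository has new commits since the last recorded run.
# In the real setup the last-seen SHAs would be persisted between runs.
should_trigger() {
    last_micronaut="$1"; last_graal="$2"
    new_micronaut="$3"; new_graal="$4"
    # True (exit 0) if at least one repository moved.
    [ "$last_micronaut" != "$new_micronaut" ] || [ "$last_graal" != "$new_graal" ]
}

if should_trigger "abc123" "def456" "abc123" "0a1b2c"; then
    echo "trigger build"   # the GraalVM SHA changed
else
    echo "skip build"
fi
```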
There are two repositories:
It is possible to test different Micronaut and GraalVM versions. For Micronaut, we always test the current stable version and the next one (using snapshots in both cases). In the case of GraalVM we can test up to three different branches at the same time: stable, prerelease and development, and we always test at least two of them.
Currently, we have:

- Micronaut: 3.2.x-SNAPSHOT and 3.3.0-SNAPSHOT
- GraalVM: stable (21.3.0), prerelease (22.0.0-dev) and development (22.1.0-dev)

These are the current branches:
- `3.2.x-stable`: Micronaut 3.2.x-SNAPSHOT and GraalVM 21.3.0. Tests that the current stable Micronaut version works properly with the current GraalVM stable version.
- `3.3.x-stable`: Micronaut 3.3.x-SNAPSHOT and GraalVM 21.3.0. Tests that the next Micronaut version works with the current GraalVM stable version.
- `3.3.x-prerelease`: Micronaut 3.3.x-SNAPSHOT and GraalVM 22.0.0-dev from the prerelease branch. Tests that the next Micronaut version works with the next GraalVM from the prerelease branch.
- `3.3.x-dev`: Micronaut 3.3.x-SNAPSHOT and GraalVM 22.1.0-dev from the master branch. Tests that the next Micronaut version works with the future GraalVM version.

When GraalVM 22.0.0-dev becomes the next stable version, i.e. 22.0.0, the new branches should be:
- `3.3.x-stable`: Micronaut 3.3.x-SNAPSHOT and GraalVM 22.0.0.
- `3.3.x-dev`: Micronaut 3.3.x-SNAPSHOT and GraalVM 22.1.0-dev from the master branch.
- `3.4.x-dev`: Micronaut 3.4.x-SNAPSHOT and GraalVM 22.1.0-dev from the master branch.

Then, approximately a month before the next GraalVM release (scheduled for April 19th 2022), the GraalVM team will create the next prerelease branch `release/graal-vm/22.1`, and we need to create the corresponding branch and a new scheduled job.
All the configuration is in this repository, and the README explains how it works. The only thing we need to modify is the `should-trigger-the-build.sh` file, so that it uses the appropriate Micronaut and GraalVM branches.
Important: When adding commits to more than one branch, always cherry-pick the changes instead of merging branches. This helps keep the branches "clean", without merge commits.
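For example, propagating a commit from the stable branch to the dev branch looks like this. The snippet below is a self-contained demo in a throwaway repository, reusing the branch names from above:

```shell
# Demo: cherry-pick a commit across release branches instead of merging,
# so each branch keeps a linear history with no merge commits.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "ci@example.com"
git config user.name "ci"

echo base > file.txt
git add file.txt
git commit -qm "base"
git branch 3.3.x-dev          # dev branch starts at the same base commit

# The fix lands on the stable branch first...
git checkout -qb 3.3.x-stable
echo fix >> file.txt
git commit -qam "Add new test application"

# ...then is cherry-picked (not merged) onto the other branch.
git checkout -q 3.3.x-dev
git cherry-pick 3.3.x-stable  # replays the tip commit of 3.3.x-stable

git log --merges --oneline    # prints nothing: the history stays linear
```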
The CI pipeline is configured in four stages:
- `log-commits`: Contains a single job that logs the previous Micronaut and GraalVM commits and the new ones that triggered the build.
- `build-graalvm`: One job per JDK version (currently 11 and 17) that clones the GraalVM repository and builds it from source code. For the GraalVM stable versions it just downloads the release.
- `micronaut`: One job per test application and JDK version. These are Micronaut applications that we know are compatible and work with GraalVM. Every job builds the native image for the associated application.
- `test`: One job per test application and JDK version. Every job starts the native image created in the previous stage, runs some functional tests and checks the results.

```yaml
log-commits:
  image: alpine:3.8
  stage: log-commits
  script:
    - ./log-commits.sh # <1>
```

```yaml
.build-graalvm:template: &build-graalvm-template # <1>
  stage: build-graalvm
  dependencies:
    - log-commits
  needs: ["log-commits"]
  artifacts:
    expire_in: 5 days
    paths:
      - $CI_PROJECT_DIR/graal_dist # <3>
  cache:
    key: ${GRAAL_NEW_COMMIT}-${CI_JOB_NAME} # <2>
    paths:
      - $CI_PROJECT_DIR/graal_dist # <2>
  tags: # <4>
    - aws
    - speed2x
    - memory2x

jdk11:build-graalvm:
  <<: *build-graalvm-template # <5>
  script:
    - if [ -d $CI_PROJECT_DIR/graal_dist ]; then exit 0; fi # <6>
    - ./build-graalvm.sh jdk11 # <7>

jdk17:build-graalvm:
  <<: *build-graalvm-template
  script:
    - if [ -d $CI_PROJECT_DIR/graal_dist ]; then exit 0; fi
    - ./build-graalvm.sh jdk17
```
1. The `build-graalvm-template` that other jobs can extend from.
2. The cache `paths` option defines the output of the GraalVM compilation, and that is what will be cached.
3. The `artifacts` path defines all the files that are passed automatically to the next stage in the pipeline. They expire automatically (meaning they are removed) after 5 days. GitLab CI saves the artifacts and downloads them automatically in the next stage. With this configuration the GraalVM SDK we just built is available to create Micronaut native images.
4. Jobs with the `aws` tag run on our own custom runners on AWS. Additionally, there are more tags that define on which instance type the job runs. The tags are explained below.

The structure of all the jobs in this stage is the same:
```yaml
.micronaut:build-template: &micronaut-build-template # <1>
  stage: micronaut
  image: registry.gitlab.com/micronaut-projects/micronaut-graal-tests/graalvm-builder # <2>
  before_script:
    - export APP_BRANCH=$(echo $CI_BUILD_REF_NAME | sed "s/-dev//" | sed "s/-stable//" | sed "s/-prerelease//") # <3>
  artifacts:
    expire_in: 5 days
  allow_failure: true # <4>
  retry:
    max: 2 # <5>
    when:
      - always

.jdk11:micronaut-build: &jdk11-build # <6>
  <<: *micronaut-build-template
  dependencies:
    - jdk11:build-graalvm
  needs: ["jdk11:build-graalvm"]

.jdk17:micronaut-build: &jdk17-build
  <<: *micronaut-build-template
  dependencies:
    - jdk17:build-graalvm
  needs: ["jdk17:build-graalvm"]

jdk11:basic-app:micronaut-build:
  <<: *jdk11-build # <7>
  artifacts:
    paths:
      - $CI_PROJECT_DIR/micronaut-basic-app/basic-app # <7>
  script:
    - ./build-basic-app.sh # <8>
  tags: # <9>
    - aws
    - speed

jdk17:basic-app:micronaut-build:
  <<: *jdk17-build
  artifacts:
    paths:
      - $CI_PROJECT_DIR/micronaut-basic-app/basic-app
  script:
    - ./build-basic-app.sh
  tags:
    - aws
    - speed
```
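The `APP_BRANCH` derivation from the template's `before_script` can be exercised locally. The snippet below extracts that pipeline into a small helper function for illustration:

```shell
# The same suffix-stripping that the micronaut stage's before_script
# applies to $CI_BUILD_REF_NAME, wrapped in a function for testing.
derive_app_branch() {
    echo "$1" | sed "s/-dev//" | sed "s/-stable//" | sed "s/-prerelease//"
}

derive_app_branch "3.3.x-dev"         # prints 3.3.x
derive_app_branch "3.3.x-stable"      # prints 3.3.x
derive_app_branch "3.2.x-prerelease"  # prints 3.2.x
```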
- Template for the jobs in the `micronaut` stage.
- Strips `-dev`, `-stable` and `-prerelease` from the current branch. The environment variable `APP_BRANCH` is used in every test application build script to check out the appropriate git branch.

```yaml
.micronaut:test-template: &micronaut-test-template # <1>
  stage: test
  image: frolvlad/alpine-glibc:alpine-3.12 # <2>
  before_script:
    - ./test-before-script.sh # <3>
  timeout: 20m
  retry:
    max: 1

.micronaut:test-distroless-template: &micronaut-test-distroless-template # <4>
  <<: *micronaut-test-template
  image:
    name: gcr.io/distroless/cc-debian10:debug # <5>
    entrypoint: [ "" ]
  before_script:
    - ./test-before-script-distroless.sh # <6>

jdk11:basic-app:test:
  <<: *micronaut-test-distroless-template # <7>
  dependencies:
    - jdk11:basic-app:micronaut-build
  needs: ["jdk11:basic-app:micronaut-build"]
  script:
    - ./test-basic-app.sh # <8>

jdk17:basic-app:test:
  <<: *micronaut-test-distroless-template
  dependencies:
    - jdk17:basic-app:micronaut-build
  needs: ["jdk17:basic-app:micronaut-build"]
  script:
    - ./test-basic-app.sh
```
- Template for the jobs in the `test` stage that build dynamic native images.
- The `frolvlad/alpine-glibc:alpine-3.12` Docker image is used to run the native-image applications.
- Installs `curl`, `jq` and `libstdc++`.
- Template for the jobs in the `test` stage that build "mostly static" native images.
- Installs `curl` and `jq`.
- `basic-app` is generated as a "mostly static" native image, so it applies that parent template.

For more information about GitLab CI see https://docs.gitlab.com/ee/ci/.
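A test script's functional check boils down to calling an endpoint and inspecting the JSON response. The sketch below uses a canned response and a plain-shell match so it stays self-contained; the endpoint and payload are hypothetical, and the real scripts use `curl` and `jq` as noted above:

```shell
# Canned JSON standing in for what a real test script would fetch, e.g.:
#   response=$(curl -s localhost:8080/hello | jq -c .)
response='{"message":"Hello World"}'

# Check that the response contains the expected payload.
case "$response" in
    *'"message":"Hello World"'*) echo "PASS" ;;
    *) echo "FAIL: unexpected response: $response"; exit 1 ;;
esac
```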
There is a channel in OCI's chat named micronaut-graal-tests that gets notifications when jobs are executed.
The test applications are in the GitHub organization micronaut-graal-tests. All the applications have a similar structure, and they test different Micronaut integrations that we know work properly with GraalVM.
There are two different branch-naming strategies in the test applications:

- One branch per Micronaut version: `3.3.x`, `3.2.x`, `3.1.x`,...
- Multiple branches per Micronaut version, one per variant: `3.3.x_h2`, `3.3.x_mysql`, `3.3.x_postgres`, `3.3.x_thymeleaf`, `3.3.x_handlebars`, `3.3.x_v3`, `3.3.x_v5`,...

These are the steps to add a new Micronaut-GraalVM test application to the pipeline:
- Create the new test application repository in https://github.com/micronaut-graal-tests.
- Do not use the master branch. Just add a common README there, like the one in basic-app or data-jdbc.
- Create the appropriate branch for the Micronaut version we want to target, e.g. 3.3.x.
- Create a `build-native-image.sh` script similar to the one in the other applications.
- Add a README with the curl endpoints needed to test the application.
- Create a new branch in https://gitlab.com/micronaut-projects/micronaut-graal-tests and modify the `.gitlab-ci.yml` file:
  - Add the new jobs to the `micronaut` and `test` stages.
  - Push with `git push -o ci.skip` to avoid triggering a build.
  - Do it first on the `3.3.x-stable` branch, then cherry-pick the commit that adds the new test applications to the `3.3.x-dev` and `3.3.x-prerelease` branches.

We use custom AWS runners with auto-scaling configured. Everything is based on the official documentation: https://docs.gitlab.com/runner/configuration/runner_autoscale_aws/.
This kind of configuration only needs one instance running 24x7 that will handle the auto-scaling of the rest of the instances used to run the tests. This instance doesn't need to be too powerful and at this moment it is a t3a.small (2 vCPU and 2 GB RAM).
It is important to note that the auto-scaling uses Docker Machine under the hood to start the new instances. Docker Machine is not maintained anymore so the Gitlab team created a fork that they maintain with critical fixes. At this moment we are using the latest version available at https://gitlab.com/gitlab-org/ci-cd/docker-machine/-/releases.
At this moment we have three different runners depending on the needs of the task. Every job that should run on our custom runners needs to have the tag aws and additionally one of the following:
- `speed`: When a job running on the GitLab CI shared runners takes a lot of time or fails because of memory constraints, this is the first tag we need to add. A job with this tag uses a c5a.xlarge EC2 instance (4 vCPU and 8 GB RAM).
- `memory`: If a job still fails with the previous runner because it needs more memory, we use this tag. It uses a t3a.xlarge EC2 instance (4 vCPU and 16 GB RAM).
- `speed2x` and `memory2x`: These two tags need to be combined; a job with both uses a c5a.2xlarge EC2 instance (8 vCPU and 16 GB RAM). At this moment only the jobs that build GraalVM from source code use this instance type.

There is another project in the same GitLab organization that contains bash utility scripts to upgrade the different versions in all the test applications:
- `create-new-branch.sh`: Creates a new branch based off another one. Used when there is a new Micronaut minor or major version.
- `upgrade-gradle-plugin-version.sh`: Upgrades the Micronaut application Gradle plugin version.
- `upgrade-gradle-version.sh`: Upgrades the Gradle Wrapper version.
- `upgrade-micronaut-data-version.sh`: Upgrades the Micronaut Data version.
- `upgrade-micronaut-version.sh`: Upgrades the Micronaut version.
- `upgrade-shadow-plugin-version.sh`: Upgrades the Shadow plugin version.

Before upgrading Netty to a new version in Micronaut core, we need to make sure it works with GraalVM. In the past there have been issues and regressions introduced by Netty.
Use the basic-app application for the test: upgrade Netty in core, publish a local snapshot and use it in the application. Make sure the endpoints documented in the application work (especially hello and the HTTP-client related ones).
Before upgrading the versions of Liquibase and Flyway in the modules, it is necessary to make sure that they work with GraalVM. This is more important for Flyway because in Micronaut Flyway we have a few GraalVM substitutions for some internal Flyway classes. In the past there have been issues with different Flyway versions because the Flyway team modified a constructor or added/removed a method.