Bump build images for python 3.12.6 updates #29963

Merged
16 commits merged on Nov 8, 2024

Conversation

@Kyle-Neale (Contributor) commented Oct 8, 2024

What does this PR do?

This PR updates the build images to Python 3.12.6.

Motivation

The embedded Python shipped with the Agent was bumped to 3.12.6, so the build images are bumped accordingly to ensure we build in the same Python environment.

Describe how to test/QA your changes

This PR should successfully build the Agent and run the installer tests, which assert that the expected Python version is used.
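
For a quick manual check (a sketch only, assuming a default Linux install where the embedded runtime lives under /opt/datadog-agent/embedded, matching DATADOG_AGENT_EMBEDDED_PATH below), the Python version can also be verified on a host running the built Agent:

  # Embedded interpreter shipped with the Agent (expect "Python 3.12.6");
  # the binary name is assumed here and may be exposed as "python" rather than "python3"
  /opt/datadog-agent/embedded/bin/python3 --version
  # The running Agent also reports its Python version in its status output
  sudo datadog-agent status | grep -i "python version"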

Possible Drawbacks / Trade-offs

Additional Notes

@agent-platform-auto-pr bot (Contributor) commented Oct 8, 2024

Gitlab CI Configuration Changes

Modified Jobs

variables (configuration)
  variables:
    AGENT_API_KEY_ORG2: agent-api-key-org-2
    AGENT_APP_KEY_ORG2: agent-ci-app-key-org-2
    AGENT_BINARIES_DIR: bin/agent
    AGENT_GITHUB_APP: agent-github-app
    AGENT_GITHUB_APP_ID: ci.datadog-agent.platform-github-app-id
    AGENT_GITHUB_INSTALLATION_ID: ci.datadog-agent.platform-github-app-installation-id
    AGENT_GITHUB_KEY: ci.datadog-agent.platform-github-app-key
    AGENT_QA_E2E: agent-qa-e2e
    AGENT_QA_PROFILE: ci.datadog-agent.agent-qa-profile
    API_KEY_DDDEV: ci.datadog-agent.datadog_api_key
    API_KEY_ORG2: ci.datadog-agent.datadog_api_key_org2
    APP_KEY_ORG2: ci.datadog-agent.datadog_app_key_org2
    ARTIFACT_DOWNLOAD_ATTEMPTS: 2
    ATLASSIAN_WRITE: atlassian-write
    BTFHUB_ARCHIVE_BRANCH: main
    BUCKET_BRANCH: dev
    CHANGELOG_COMMIT_SHA: ci.datadog-agent.gitlab_changelog_commit_sha
    CHOCOLATEY_API_KEY: ci.datadog-agent.chocolatey_api_key
-   CI_IMAGE_BTF_GEN: v47979770-a5a4cfd0
+   CI_IMAGE_BTF_GEN: v48262719-bfb00f80
    CI_IMAGE_BTF_GEN_SUFFIX: ''
-   CI_IMAGE_DD_AGENT_TESTING: v47979770-a5a4cfd0
+   CI_IMAGE_DD_AGENT_TESTING: v48262719-bfb00f80
    CI_IMAGE_DD_AGENT_TESTING_SUFFIX: ''
-   CI_IMAGE_DEB_ARM64: v47979770-a5a4cfd0
+   CI_IMAGE_DEB_ARM64: v48262719-bfb00f80
    CI_IMAGE_DEB_ARM64_SUFFIX: ''
-   CI_IMAGE_DEB_ARMHF: v47979770-a5a4cfd0
+   CI_IMAGE_DEB_ARMHF: v48262719-bfb00f80
    CI_IMAGE_DEB_ARMHF_SUFFIX: ''
-   CI_IMAGE_DEB_X64: v47979770-a5a4cfd0
+   CI_IMAGE_DEB_X64: v48262719-bfb00f80
    CI_IMAGE_DEB_X64_SUFFIX: ''
-   CI_IMAGE_DOCKER_ARM64: v47979770-a5a4cfd0
+   CI_IMAGE_DOCKER_ARM64: v48262719-bfb00f80
    CI_IMAGE_DOCKER_ARM64_SUFFIX: ''
-   CI_IMAGE_DOCKER_X64: v47979770-a5a4cfd0
+   CI_IMAGE_DOCKER_X64: v48262719-bfb00f80
    CI_IMAGE_DOCKER_X64_SUFFIX: ''
-   CI_IMAGE_GITLAB_AGENT_DEPLOY: v47979770-a5a4cfd0
+   CI_IMAGE_GITLAB_AGENT_DEPLOY: v48262719-bfb00f80
    CI_IMAGE_GITLAB_AGENT_DEPLOY_SUFFIX: ''
-   CI_IMAGE_LINUX_GLIBC_2_17_X64: v47979770-a5a4cfd0
+   CI_IMAGE_LINUX_GLIBC_2_17_X64: v48262719-bfb00f80
    CI_IMAGE_LINUX_GLIBC_2_17_X64_SUFFIX: ''
-   CI_IMAGE_LINUX_GLIBC_2_23_ARM64: v47979770-a5a4cfd0
+   CI_IMAGE_LINUX_GLIBC_2_23_ARM64: v48262719-bfb00f80
    CI_IMAGE_LINUX_GLIBC_2_23_ARM64_SUFFIX: ''
-   CI_IMAGE_RPM_ARM64: v47979770-a5a4cfd0
+   CI_IMAGE_RPM_ARM64: v48262719-bfb00f80
    CI_IMAGE_RPM_ARM64_SUFFIX: ''
-   CI_IMAGE_RPM_ARMHF: v47979770-a5a4cfd0
+   CI_IMAGE_RPM_ARMHF: v48262719-bfb00f80
    CI_IMAGE_RPM_ARMHF_SUFFIX: ''
-   CI_IMAGE_RPM_X64: v47979770-a5a4cfd0
+   CI_IMAGE_RPM_X64: v48262719-bfb00f80
    CI_IMAGE_RPM_X64_SUFFIX: ''
-   CI_IMAGE_SYSTEM_PROBE_ARM64: v47979770-a5a4cfd0
+   CI_IMAGE_SYSTEM_PROBE_ARM64: v48262719-bfb00f80
    CI_IMAGE_SYSTEM_PROBE_ARM64_SUFFIX: ''
-   CI_IMAGE_SYSTEM_PROBE_X64: v47979770-a5a4cfd0
+   CI_IMAGE_SYSTEM_PROBE_X64: v48262719-bfb00f80
    CI_IMAGE_SYSTEM_PROBE_X64_SUFFIX: ''
-   CI_IMAGE_WIN_1809_X64: v47979770-a5a4cfd0
+   CI_IMAGE_WIN_1809_X64: v48262719-bfb00f80
    CI_IMAGE_WIN_1809_X64_SUFFIX: ''
-   CI_IMAGE_WIN_LTSC2022_X64: v47979770-a5a4cfd0
+   CI_IMAGE_WIN_LTSC2022_X64: v48262719-bfb00f80
    CI_IMAGE_WIN_LTSC2022_X64_SUFFIX: ''
    CLANG_LLVM_VER: 12.0.1
    CLUSTER_AGENT_BINARIES_DIR: bin/datadog-cluster-agent
    CLUSTER_AGENT_CLOUDFOUNDRY_BINARIES_DIR: bin/datadog-cluster-agent-cloudfoundry
    CODECOV: codecov
    CODECOV_TOKEN: ci.datadog-agent.codecov_token
    CWS_INSTRUMENTATION_BINARIES_DIR: bin/cws-instrumentation
-   DATADOG_AGENT_ARMBUILDIMAGES: v47979770-a5a4cfd0
+   DATADOG_AGENT_ARMBUILDIMAGES: v48262719-bfb00f80
    DATADOG_AGENT_ARMBUILDIMAGES_SUFFIX: ''
-   DATADOG_AGENT_BTF_GEN_BUILDIMAGES: v47979770-a5a4cfd0
+   DATADOG_AGENT_BTF_GEN_BUILDIMAGES: v48262719-bfb00f80
    DATADOG_AGENT_BTF_GEN_BUILDIMAGES_SUFFIX: ''
-   DATADOG_AGENT_BUILDIMAGES: v47979770-a5a4cfd0
+   DATADOG_AGENT_BUILDIMAGES: v48262719-bfb00f80
    DATADOG_AGENT_BUILDIMAGES_SUFFIX: ''
    DATADOG_AGENT_EMBEDDED_PATH: /opt/datadog-agent/embedded
-   DATADOG_AGENT_SYSPROBE_BUILDIMAGES: v47979770-a5a4cfd0
+   DATADOG_AGENT_SYSPROBE_BUILDIMAGES: v48262719-bfb00f80
    DATADOG_AGENT_SYSPROBE_BUILDIMAGES_SUFFIX: ''
-   DATADOG_AGENT_WINBUILDIMAGES: v47979770-a5a4cfd0
+   DATADOG_AGENT_WINBUILDIMAGES: v48262719-bfb00f80
    DATADOG_AGENT_WINBUILDIMAGES_SUFFIX: ''
    DD_AGENT_TESTING_DIR: $CI_PROJECT_DIR/test/kitchen
    DD_PKG_VERSION: latest
    DEB_GPG_KEY: ci.datadog-agent.deb_signing_private_key_${DEB_GPG_KEY_ID}
    DEB_GPG_KEY_ID: c0962c7d
    DEB_GPG_KEY_NAME: Datadog, Inc. APT key
    DEB_RPM_TESTING_BUCKET_BRANCH: testing
    DEB_S3_BUCKET: apt.datad0g.com
    DEB_SIGNING_PASSPHRASE: ci.datadog-agent.deb_signing_key_passphrase_${DEB_GPG_KEY_ID}
    DEB_TESTING_S3_BUCKET: apttesting.datad0g.com
    DOCKER_REGISTRY_LOGIN: ci.datadog-agent.docker_hub_login
    DOCKER_REGISTRY_PWD: ci.datadog-agent.docker_hub_pwd
    DOCKER_REGISTRY_RO: dockerhub-readonly
    DOCKER_REGISTRY_URL: docker.io
    DOGSTATSD_BINARIES_DIR: bin/dogstatsd
    E2E_PULUMI_CONFIG_PASSPHRASE: ci.datadog-agent.pulumi_password
    E2E_TESTS_API_KEY: ci.datadog-agent.e2e_tests_api_key
    E2E_TESTS_APP_KEY: ci.datadog-agent.e2e_tests_app_key
    E2E_TESTS_AZURE_CLIENT_ID: ci.datadog-agent.e2e_tests_azure_client_id
    E2E_TESTS_AZURE_CLIENT_SECRET: ci.datadog-agent.e2e_tests_azure_client_secret
    E2E_TESTS_AZURE_SUBSCRIPTION_ID: ci.datadog-agent.e2e_tests_azure_subscription_id
    E2E_TESTS_AZURE_TENANT_ID: ci.datadog-agent.e2e_tests_azure_tenant_id
    E2E_TESTS_GCP_CREDENTIALS: ci.datadog-agent.e2e_tests_gcp_credentials
    E2E_TESTS_RC_KEY: ci.datadog-agent.e2e_tests_rc_key
    EXECUTOR_JOB_SECTION_ATTEMPTS: 2
    FF_KUBERNETES_HONOR_ENTRYPOINT: true
    FF_SCRIPT_SECTIONS: 1
    GENERAL_ARTIFACTS_CACHE_BUCKET_URL: https://dd-agent-omnibus.s3.amazonaws.com
    GET_SOURCES_ATTEMPTS: 2
    GITHUB_PR_COMMENTER_APP_KEY: pr-commenter.github_app_key
    GITHUB_PR_COMMENTER_INSTALLATION_ID: pr-commenter.github_installation_id
    GITHUB_PR_COMMENTER_INTEGRATION_ID: pr-commenter.github_integration_id
    GITLAB_FULL_API_TOKEN: ci.datadog-agent.gitlab_full_api_token
    GITLAB_READ_API_TOKEN: ci.datadog-agent.gitlab_read_api_token
    GITLAB_SCHEDULER_TOKEN: ci.datadog-agent.gitlab_pipelines_scheduler_token
    GITLAB_TOKEN: gitlab-token
    GO_TEST_SKIP_FLAKE: 'true'
    INSTALL_SCRIPT_API_KEY: ci.agent-linux-install-script.datadog_api_key_2
    INSTALL_SCRIPT_API_KEY_ORG2: install-script-api-key-org-2
    INTEGRATION_WHEELS_CACHE_BUCKET: dd-agent-omnibus
    JIRA_READ_API_TOKEN: ci.datadog-agent.jira_read_api_token
    KERNEL_MATRIX_TESTING_ARM_AMI_ID: ami-0b5f838a19d37fc61
    KERNEL_MATRIX_TESTING_X86_AMI_ID: ami-05b3973acf5422348
    KITCHEN_AZURE_CLIENT_ID: ci.datadog-agent.azure_kitchen_client_id
    KITCHEN_AZURE_CLIENT_SECRET: ci.datadog-agent.azure_kitchen_client_secret
    KITCHEN_AZURE_SUBSCRIPTION_ID: ci.datadog-agent.azure_kitchen_subscription_id
    KITCHEN_AZURE_TENANT_ID: ci.datadog-agent.azure_kitchen_tenant_id
    KITCHEN_EC2_SSH_KEY: ci.datadog-agent.aws_ec2_kitchen_ssh_key
    KITCHEN_INFRASTRUCTURE_FLAKES_RETRY: 2
    MACOS_GITHUB_APP_1: macos-github-app-one
    MACOS_GITHUB_APP_2: macos-github-app-two
    MACOS_GITHUB_APP_ID: ci.datadog-agent.macos_github_app_id
    MACOS_GITHUB_APP_ID_2: ci.datadog-agent.macos_github_app_id_2
    MACOS_GITHUB_INSTALLATION_ID: ci.datadog-agent.macos_github_installation_id
    MACOS_GITHUB_INSTALLATION_ID_2: ci.datadog-agent.macos_github_installation_id_2
    MACOS_GITHUB_KEY: ci.datadog-agent.macos_github_key_b64
    MACOS_GITHUB_KEY_2: ci.datadog-agent.macos_github_key_b64_2
    MACOS_S3_BUCKET: dd-agent-macostesting
    OMNIBUS_BASE_DIR: /omnibus
    OMNIBUS_GIT_CACHE_DIR: /tmp/omnibus-git-cache
    OMNIBUS_PACKAGE_DIR: $CI_PROJECT_DIR/omnibus/pkg/
    OMNIBUS_PACKAGE_DIR_SUSE: $CI_PROJECT_DIR/omnibus/suse/pkg
    PROCESS_S3_BUCKET: datad0g-process-agent
    RELEASE_VERSION_6: nightly
    RELEASE_VERSION_7: nightly-a7
    RESTORE_CACHE_ATTEMPTS: 2
    RPM_GPG_KEY: ci.datadog-agent.rpm_signing_private_key_${RPM_GPG_KEY_ID}
    RPM_GPG_KEY_ID: b01082d3
    RPM_GPG_KEY_NAME: Datadog, Inc. RPM key
    RPM_S3_BUCKET: yum.datad0g.com
    RPM_SIGNING_PASSPHRASE: ci.datadog-agent.rpm_signing_key_passphrase_${RPM_GPG_KEY_ID}
    RPM_TESTING_S3_BUCKET: yumtesting.datad0g.com
    RUN_E2E_TESTS: auto
    RUN_KMT_TESTS: auto
    RUN_UNIT_TESTS: auto
    S3_ARTIFACTS_URI: s3://dd-ci-artefacts-build-stable/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    S3_CP_CMD: aws s3 cp $S3_CP_OPTIONS
    S3_CP_OPTIONS: --no-progress --region us-east-1 --sse AES256
    S3_DD_AGENT_OMNIBUS_BTFS_URI: s3://dd-agent-omnibus/btfs
    S3_DD_AGENT_OMNIBUS_LLVM_URI: s3://dd-agent-omnibus/llvm
    S3_DSD6_URI: s3://dsd6-staging
    S3_OMNIBUS_CACHE_BUCKET: dd-ci-datadog-agent-omnibus-cache-build-stable
    S3_PERMANENT_ARTIFACTS_URI: s3://dd-ci-persistent-artefacts-build-stable/$CI_PROJECT_NAME
    S3_PROJECT_ARTIFACTS_URI: s3://dd-ci-artefacts-build-stable/$CI_PROJECT_NAME
    S3_RELEASE_ARTIFACTS_URI: s3://dd-release-artifacts/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    S3_RELEASE_INSTALLER_ARTIFACTS_URI: s3://dd-release-artifacts/datadog-installer/$CI_PIPELINE_ID
    S3_SBOM_STORAGE_URI: s3://sbom-root-us1-ddbuild-io/$CI_PROJECT_NAME/$CI_PIPELINE_ID
    SLACK_AGENT: slack-agent-ci
    SLACK_AGENT_CI_TOKEN: ci.datadog-agent.slack_agent_ci_token
    SMP_ACCOUNT: smp
    SMP_ACCOUNT_ID: ci.datadog-agent.single-machine-performance-account-id
    SMP_AGENT_TEAM_ID: ci.datadog-agent.single-machine-performance-agent-team-id
    SMP_API: ci.datadog-agent.single-machine-performance-api
    SMP_BOT_ACCESS_KEY: ci.datadog-agent.single-machine-performance-bot-access-key
    SMP_BOT_ACCESS_KEY_ID: ci.datadog-agent.single-machine-performance-bot-access-key-id
    SSH_KEY: ci.datadog-agent.ssh_key
    SSH_KEY_RSA: ci.datadog-agent.ssh_key_rsa
    SSH_PUBLIC_KEY_RSA: ci.datadog-agent.ssh_public_key_rsa
    STATIC_BINARIES_DIR: bin/static
    SYSTEM_PROBE_BINARIES_DIR: bin/system-probe
    USE_S3_CACHING: --omnibus-s3-cache
    VCPKG_BLOB_SAS_URL: ci.datadog-agent-buildimages.vcpkg_blob_sas_url
    WINDOWS_BUILDS_S3_BUCKET: $WIN_S3_BUCKET/builds
    WINDOWS_POWERSHELL_DIR: $CI_PROJECT_DIR/signed_scripts
    WINDOWS_TESTING_S3_BUCKET_A6: pipelines/A6/$CI_PIPELINE_ID
    WINDOWS_TESTING_S3_BUCKET_A7: pipelines/A7/$CI_PIPELINE_ID
    WINGET_PAT: ci.datadog-agent.winget_pat
    WIN_S3_BUCKET: dd-agent-mstesting
.failure_summary_setup
  .failure_summary_setup:
  - SLACK_API_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SLACK_AGENT token) ||
    exit $?; export SLACK_API_TOKEN
  - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN read_api)
    || exit $?; export GITLAB_TOKEN
  - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
    || exit $?; export DD_API_KEY
  - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-notifications.txt
+   --break-system-packages
.lint_macos_gitlab
  .lint_macos_gitlab:
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e linter.go --cpus 12 --debug --timeout 60
    stage: lint
.macos_gitlab
  .macos_gitlab:
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
.package_oci
  .package_oci:
    artifacts:
      paths:
      - ${OMNIBUS_PACKAGE_DIR}
    before_script:
    - PACKAGE_VERSION="$(inv agent.version --url-safe --major-version 7)-1" || exit
      $?
    - export INSTALL_DIR=/opt/datadog-packages/${OCI_PRODUCT}/${PACKAGE_VERSION}
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - rm -f $OMNIBUS_PACKAGE_DIR/*-dbg-*.tar.xz
    - ls -l $OMNIBUS_PACKAGE_DIR
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - set +x
    - git config --global url."https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.ddbuild.io/DataDog/".insteadOf
      "https://github.com/DataDog/"
    - go env -w GOPRIVATE="github.com/DataDog/*"
    - ${CI_PROJECT_DIR}/tools/ci/retry.sh go install github.com/DataDog/datadog-packages/cmd/datadog-package@latest
    - OUTPUT_DIR="/tmp/oci_output"
    - mkdir -p ${OUTPUT_DIR}
    - ls $OMNIBUS_PACKAGE_DIR
    - "if [ $(ls $OMNIBUS_PACKAGE_DIR/*.oci.tar 2> /dev/null | wc -l) -ge 1 ]; then\n\
      \  echo \"Copying already built images to output dir\"\n  cp $OMNIBUS_PACKAGE_DIR/*.oci.tar\
      \ ${OUTPUT_DIR}\nfi\n"
    - "for ARCH in \"amd64\" \"arm64\"; do\n  INPUT_FILE=\"${OMNIBUS_PACKAGE_DIR}${OCI_PRODUCT}-*${ARCH}.tar.xz\"\
      \n  OUTPUT_FILE=\"$(basename -a -s .xz ${INPUT_FILE})\"\n  MERGED_FILE=$(basename\
      \ -a $OMNIBUS_PACKAGE_DIR/*.tar.xz | head -n 1 | sed \"s/-${ARCH}.tar.xz//\").oci.tar\n\
      \  export MERGED_FILE\n  INPUT_DIR=\"/tmp/input_${ARCH}\"\n  mkdir -p ${INPUT_DIR}\n\
      \  echo \"Generating OCI for $ARCH.\"\n  echo \"Extracting to temporary input\
      \ dir $INPUT_FILE -> $INPUT_DIR\"\n  tar xJf ${INPUT_FILE} -C ${INPUT_DIR}\n \
      \ echo \"Creating OCI layer -> ${OUTPUT_DIR}/${OUTPUT_FILE}\"\n  if [ \"${OCI_PRODUCT}\"\
      \ = \"datadog-agent\" ]; then\n    EXTRA_FLAGS=\"--configs ${INPUT_DIR}/etc/datadog-agent\"\
      \n  fi\n  datadog-package create \\\n    --version ${PACKAGE_VERSION} \\\n   \
      \ --package ${OCI_PRODUCT} \\\n    --os linux \\\n    --arch ${ARCH} \\\n    --archive\
      \ --archive-path \"${OUTPUT_DIR}/${OUTPUT_FILE}\" \\\n    ${EXTRA_FLAGS} \\\n\
      \    ${INPUT_DIR}/${INSTALL_DIR}/\n  rm -f ${INPUT_FILE}\ndone\n"
    - echo "Aggregating all layers into one package -> ${MERGED_FILE}"
    - ls -l ${OUTPUT_DIR}/
    - datadog-package merge ${OUTPUT_DIR}/*.tar
    - mv merged.tar ${OMNIBUS_PACKAGE_DIR}/${MERGED_FILE}
    stage: packaging
    tags:
    - arch:amd64
    variables:
      KUBERNETES_CPU_REQUEST: 16
      KUBERNETES_MEMORY_LIMIT: 32Gi
      KUBERNETES_MEMORY_REQUEST: 32Gi
.tests_macos_gitlab
  .tests_macos_gitlab:
    allow_failure: true
    artifacts:
      expire_in: 2 weeks
      paths:
      - $TEST_OUTPUT_FILE
      - junit-*.tgz
      reports:
        annotations:
        - $EXTERNAL_LINKS_PATH
        junit: '**/junit-out-*.xml'
      when: always
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e gitlab.generate-ci-visibility-links --output=$EXTERNAL_LINKS_PATH
    - FAST_TESTS_FLAG=""
    - if [[ "$FAST_TESTS" == "true" ]]; then FAST_TESTS_FLAG="--only-impacted-packages";
      fi
    - inv -e test --rerun-fails=2 --race --profile --cpus 12 --save-result-json $TEST_OUTPUT_FILE
      --junit-tar "junit-${CI_JOB_NAME}.tgz" $FAST_TESTS_FLAG --test-washer
    - inv -e invoke-unit-tests
    stage: source_test
    variables:
      TEST_OUTPUT_FILE: test_output.json
agent_oci
  agent_oci:
    artifacts:
      paths:
      - ${OMNIBUS_PACKAGE_DIR}
    before_script:
    - PACKAGE_VERSION="$(inv agent.version --url-safe --major-version 7)-1" || exit
      $?
    - export INSTALL_DIR=/opt/datadog-packages/${OCI_PRODUCT}/${PACKAGE_VERSION}
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - datadog-agent-oci-x64-a7
    - datadog-agent-oci-arm64-a7
    - windows_msi_and_bosh_zip_x64-a7
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - rm -f $OMNIBUS_PACKAGE_DIR/*-dbg-*.tar.xz
    - ls -l $OMNIBUS_PACKAGE_DIR
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - set +x
    - git config --global url."https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.ddbuild.io/DataDog/".insteadOf
      "https://github.com/DataDog/"
    - go env -w GOPRIVATE="github.com/DataDog/*"
    - ${CI_PROJECT_DIR}/tools/ci/retry.sh go install github.com/DataDog/datadog-packages/cmd/datadog-package@latest
    - OUTPUT_DIR="/tmp/oci_output"
    - mkdir -p ${OUTPUT_DIR}
    - ls $OMNIBUS_PACKAGE_DIR
    - "if [ $(ls $OMNIBUS_PACKAGE_DIR/*.oci.tar 2> /dev/null | wc -l) -ge 1 ]; then\n\
      \  echo \"Copying already built images to output dir\"\n  cp $OMNIBUS_PACKAGE_DIR/*.oci.tar\
      \ ${OUTPUT_DIR}\nfi\n"
    - "for ARCH in \"amd64\" \"arm64\"; do\n  INPUT_FILE=\"${OMNIBUS_PACKAGE_DIR}${OCI_PRODUCT}-*${ARCH}.tar.xz\"\
      \n  OUTPUT_FILE=\"$(basename -a -s .xz ${INPUT_FILE})\"\n  MERGED_FILE=$(basename\
      \ -a $OMNIBUS_PACKAGE_DIR/*.tar.xz | head -n 1 | sed \"s/-${ARCH}.tar.xz//\").oci.tar\n\
      \  export MERGED_FILE\n  INPUT_DIR=\"/tmp/input_${ARCH}\"\n  mkdir -p ${INPUT_DIR}\n\
      \  echo \"Generating OCI for $ARCH.\"\n  echo \"Extracting to temporary input\
      \ dir $INPUT_FILE -> $INPUT_DIR\"\n  tar xJf ${INPUT_FILE} -C ${INPUT_DIR}\n \
      \ echo \"Creating OCI layer -> ${OUTPUT_DIR}/${OUTPUT_FILE}\"\n  if [ \"${OCI_PRODUCT}\"\
      \ = \"datadog-agent\" ]; then\n    EXTRA_FLAGS=\"--configs ${INPUT_DIR}/etc/datadog-agent\"\
      \n  fi\n  datadog-package create \\\n    --version ${PACKAGE_VERSION} \\\n   \
      \ --package ${OCI_PRODUCT} \\\n    --os linux \\\n    --arch ${ARCH} \\\n    --archive\
      \ --archive-path \"${OUTPUT_DIR}/${OUTPUT_FILE}\" \\\n    ${EXTRA_FLAGS} \\\n\
      \    ${INPUT_DIR}/${INSTALL_DIR}/\n  rm -f ${INPUT_FILE}\ndone\n"
    - echo "Aggregating all layers into one package -> ${MERGED_FILE}"
    - ls -l ${OUTPUT_DIR}/
    - datadog-package merge ${OUTPUT_DIR}/*.tar
    - mv merged.tar ${OMNIBUS_PACKAGE_DIR}/${MERGED_FILE}
    stage: packaging
    tags:
    - arch:amd64
    variables:
      KUBERNETES_CPU_REQUEST: 16
      KUBERNETES_MEMORY_LIMIT: 32Gi
      KUBERNETES_MEMORY_REQUEST: 32Gi
      OCI_PRODUCT: datadog-agent
close_failing_tests_stale_issues
  close_failing_tests_stale_issues:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_arm64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs: []
    rules:
    - if: $CI_COMMIT_BRANCH != "main" || $CI_PIPELINE_SOURCE != "schedule"
      when: never
    - if: $BUCKET_BRANCH != "nightly" && $BUCKET_BRANCH != "oldnightly" && $BUCKET_BRANCH
        != "dev"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
      when: always
    script:
    - weekday="$(date --utc '+%A')"
    - "if [ \"$weekday\" != \"Friday\" ]; then\n  echo \"This script is run weekly on\
      \ Fridays\"\n  exit\nfi\n"
    - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
      || exit $?; export DD_API_KEY
    - DD_APP_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_APP_KEY_ORG2 token)
      || exit $?; export DD_APP_KEY
    - ATLASSIAN_PASSWORD=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      token) || exit $?; export ATLASSIAN_PASSWORD
    - ATLASSIAN_USERNAME=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      user) || exit $?; export ATLASSIAN_USERNAME
    - python3 -m pip install -r requirements.txt -r tasks/requirements_release_tasks.txt
+     --break-system-packages
    - inv -e notify.close-failing-tests-stale-issues
    stage: notify
    tags:
    - arch:arm64
create_release_qa_cards
  create_release_qa_cards:
    allow_failure: true
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    rules:
    - if: $DEPLOY_AGENT != "true" && $DDR_WORKFLOW_ID == null
      when: never
    - if: ($DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null) && $BUCKET_BRANCH ==
        "beta" && $CI_COMMIT_TAG =~ /^[0-9]+\.[0-9]+\.[0-9]+-rc\.[0-9]+$/
      variables:
        AGENT_REPOSITORY: agent
        DSD_REPOSITORY: dogstatsd
        IMG_REGISTRIES: public
      when: on_success
    script:
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
    - ATLASSIAN_PASSWORD=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      token) || exit $?; export ATLASSIAN_PASSWORD
    - ATLASSIAN_USERNAME=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      user) || exit $?; export ATLASSIAN_USERNAME
-   - pip install ddqa
+   - pip install ddqa --break-system-packages
    - inv release.create-qa-cards -t ${CI_COMMIT_REF_NAME}
    stage: .pre
    tags:
    - arch:amd64
github_rate_limit_info
  github_rate_limit_info:
    allow_failure: true
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - python3 -m pip install -r tasks/libs/requirements-github.txt datadog_api_client
+     --break-system-packages
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_1
      key_b64) || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_1 app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_1
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
      || exit $?; export DD_API_KEY
    - inv github.send-rate-limit-info-datadog --pipeline-id $CI_PIPELINE_ID --app-instance
      1
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2
      key_b64) || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2 app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $MACOS_GITHUB_APP_2
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
      || exit $?; export DD_API_KEY
    - inv github.send-rate-limit-info-datadog --pipeline-id $CI_PIPELINE_ID --app-instance
      2
    stage: .pre
    tags:
    - arch:amd64
installer_oci
  installer_oci:
    artifacts:
      paths:
      - ${OMNIBUS_PACKAGE_DIR}
    before_script:
    - PACKAGE_VERSION="$(inv agent.version --url-safe --major-version 7)-1" || exit
      $?
    - export INSTALL_DIR=/opt/datadog-packages/${OCI_PRODUCT}/${PACKAGE_VERSION}
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - installer-arm64-oci
    - installer-amd64-oci
    - windows-installer-amd64
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - rm -f $OMNIBUS_PACKAGE_DIR/*-dbg-*.tar.xz
    - ls -l $OMNIBUS_PACKAGE_DIR
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - set +x
    - git config --global url."https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.ddbuild.io/DataDog/".insteadOf
      "https://github.com/DataDog/"
    - go env -w GOPRIVATE="github.com/DataDog/*"
    - ${CI_PROJECT_DIR}/tools/ci/retry.sh go install github.com/DataDog/datadog-packages/cmd/datadog-package@latest
    - OUTPUT_DIR="/tmp/oci_output"
    - mkdir -p ${OUTPUT_DIR}
    - ls $OMNIBUS_PACKAGE_DIR
    - "if [ $(ls $OMNIBUS_PACKAGE_DIR/*.oci.tar 2> /dev/null | wc -l) -ge 1 ]; then\n\
      \  echo \"Copying already built images to output dir\"\n  cp $OMNIBUS_PACKAGE_DIR/*.oci.tar\
      \ ${OUTPUT_DIR}\nfi\n"
    - "for ARCH in \"amd64\" \"arm64\"; do\n  INPUT_FILE=\"${OMNIBUS_PACKAGE_DIR}${OCI_PRODUCT}-*${ARCH}.tar.xz\"\
      \n  OUTPUT_FILE=\"$(basename -a -s .xz ${INPUT_FILE})\"\n  MERGED_FILE=$(basename\
      \ -a $OMNIBUS_PACKAGE_DIR/*.tar.xz | head -n 1 | sed \"s/-${ARCH}.tar.xz//\").oci.tar\n\
      \  export MERGED_FILE\n  INPUT_DIR=\"/tmp/input_${ARCH}\"\n  mkdir -p ${INPUT_DIR}\n\
      \  echo \"Generating OCI for $ARCH.\"\n  echo \"Extracting to temporary input\
      \ dir $INPUT_FILE -> $INPUT_DIR\"\n  tar xJf ${INPUT_FILE} -C ${INPUT_DIR}\n \
      \ echo \"Creating OCI layer -> ${OUTPUT_DIR}/${OUTPUT_FILE}\"\n  if [ \"${OCI_PRODUCT}\"\
      \ = \"datadog-agent\" ]; then\n    EXTRA_FLAGS=\"--configs ${INPUT_DIR}/etc/datadog-agent\"\
      \n  fi\n  datadog-package create \\\n    --version ${PACKAGE_VERSION} \\\n   \
      \ --package ${OCI_PRODUCT} \\\n    --os linux \\\n    --arch ${ARCH} \\\n    --archive\
      \ --archive-path \"${OUTPUT_DIR}/${OUTPUT_FILE}\" \\\n    ${EXTRA_FLAGS} \\\n\
      \    ${INPUT_DIR}/${INSTALL_DIR}/\n  rm -f ${INPUT_FILE}\ndone\n"
    - echo "Aggregating all layers into one package -> ${MERGED_FILE}"
    - ls -l ${OUTPUT_DIR}/
    - datadog-package merge ${OUTPUT_DIR}/*.tar
    - mv merged.tar ${OMNIBUS_PACKAGE_DIR}/${MERGED_FILE}
    stage: packaging
    tags:
    - arch:amd64
    variables:
      KUBERNETES_CPU_REQUEST: 16
      KUBERNETES_MEMORY_LIMIT: 32Gi
      KUBERNETES_MEMORY_REQUEST: 32Gi
      OCI_PRODUCT: datadog-installer
invoke_unit_tests
  invoke_unit_tests:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs: []
    rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - changes:
        compare_to: main
        paths:
        - tasks/**/*
    script:
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - inv -e invoke-unit-tests.run
    stage: source_test
    tags:
    - arch:amd64
kitchen_invoke_unit_tests
  kitchen_invoke_unit_tests:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs: []
    rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - changes:
        compare_to: main
        paths:
        - test/kitchen/tasks/**/*
    script:
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - pushd test/kitchen
    - inv -e kitchen.invoke-unit-tests
    - popd
    stage: source_test
    tags:
    - arch:amd64
lint_macos_gitlab_amd64
  lint_macos_gitlab_amd64:
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e linter.go --cpus 12 --debug --timeout 60
    stage: lint
    tags:
    - macos:monterey-amd64
    - specific:true
lint_macos_gitlab_arm64
  lint_macos_gitlab_arm64:
    allow_failure: true
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - allow_failure: true
      when: manual
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e linter.go --cpus 12 --debug --timeout 60
    stage: lint
    tags:
    - macos:monterey-arm64
    - specific:true
notify-slack
  notify-slack:
    image: registry.ddbuild.io/slack-notifier:v27936653-9a2a7db-sdm-gbi-jammy@sha256:c9d1145319d1904fa72ea97904a15200d3cb684324723f9e1700bc02cc85065c
    needs:
    - internal_kubernetes_deploy_experimental
    rules:
    - if: $FORCE_K8S_DEPLOYMENT == "true"
      when: always
    - if: $CI_COMMIT_BRANCH != "main"
      when: never
    - if: $DDR != "true"
      when: never
    - if: $APPS !~ "/^datadog-agent/"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
    script:
    - export SDM_JWT=$(vault read -field=token identity/oidc/token/sdm)
-   - python3 -m pip install -r tasks/requirements.txt
+   - python3 -m pip install -r tasks/requirements.txt --break-system-packages
    - inv pipeline.changelog ${CI_COMMIT_SHORT_SHA} || exit $?
    stage: internal_kubernetes_deploy
    tags:
    - arch:amd64
notify_ebpf_complexity_changes
  notify_ebpf_complexity_changes:
    allow_failure: true
    before_script:
-   - python3 -m pip install tabulate
+   - python3 -m pip install -r tasks/kernel_matrix_testing/requirements.txt --break-system-packages
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN read_api)
      || exit $?; export GITLAB_TOKEN
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - job: kmt_run_sysprobe_tests_x64
      optional: true
      parallel:
        matrix:
        - TAG:
          - amazon_5.4
          - debian_10
          - ubuntu_18.04
          - centos_8
          - opensuse_15.3
          - suse_12.5
          - fedora_38
          TEST_SET: no_usm
    - job: kmt_run_sysprobe_tests_arm64
      optional: true
      parallel:
        matrix:
        - TAG:
          - amazon_5.4
          - debian_10
          - ubuntu_18.04
          - centos_8
          - opensuse_15.3
          - suse_12.5
          - fedora_38
          TEST_SET: no_usm
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $CI_COMMIT_BRANCH == "main"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
      when: never
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_KMT_TESTS == 'on'
    - changes:
        compare_to: main
        paths:
        - pkg/collector/corechecks/ebpf/**/*
        - pkg/collector/corechecks/servicediscovery/module/*
        - pkg/ebpf/**/*
        - pkg/network/**/*
        - pkg/process/monitor/*
        - pkg/util/kernel/**/*
        - pkg/dynamicinstrumentation/**/*
        - pkg/gpu/**/*
        - .gitlab/kernel_matrix_testing/system_probe.yml
        - .gitlab/kernel_matrix_testing/common.yml
        - .gitlab/source_test/ebpf.yml
        - test/new-e2e/system-probe/**/*
        - test/new-e2e/scenarios/system-probe/**/*
        - test/new-e2e/pkg/runner/**/*
        - test/new-e2e/pkg/utils/**/*
        - test/new-e2e/go.mod
        - tasks/system_probe.py
        - tasks/kmt.py
        - tasks/kernel_matrix_testing/*
    - allow_failure: true
      if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_KMT_TESTS == 'on'
    - changes:
        compare_to: main
        paths:
        - pkg/ebpf/**/*
        - pkg/security/**/*
        - pkg/eventmonitor/**/*
        - .gitlab/kernel_matrix_testing/security_agent.yml
        - .gitlab/kernel_matrix_testing/common.yml
        - .gitlab/source_test/ebpf.yml
        - test/new-e2e/system-probe/**/*
        - test/new-e2e/scenarios/system-probe/**/*
        - test/new-e2e/pkg/runner/**/*
        - test/new-e2e/pkg/utils/**/*
        - test/new-e2e/go.mod
        - tasks/security_agent.py
        - tasks/kmt.py
        - tasks/kernel_matrix_testing/*
    - allow_failure: true
      when: manual
    script:
    - inv -e ebpf.generate-complexity-summary-for-pr
    stage: notify
    tags:
    - arch:amd64
notify_failure_summary_daily
  notify_failure_summary_daily:
    dependencies: []
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    resource_group: notification
    rules:
    - if: $CI_COMMIT_BRANCH != "main" || $CI_PIPELINE_SOURCE != "schedule"
      when: never
    - if: $BUCKET_BRANCH != "nightly" && $BUCKET_BRANCH != "oldnightly" && $BUCKET_BRANCH
        != "dev"
      when: never
    - if: $DEPLOY_AGENT == "true" || $DDR_WORKFLOW_ID != null
      when: always
    script:
    - SLACK_API_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SLACK_AGENT token)
      || exit $?; export SLACK_API_TOKEN
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN read_api)
      || exit $?; export GITLAB_TOKEN
    - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
      || exit $?; export DD_API_KEY
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-notifications.txt
+     --break-system-packages
    - weekday="$(date --utc '+%A')"
    - "if [ \"$weekday\" = \"Sunday\" ] || [ \"$weekday\" = \"Monday\" ]; then\n  echo\
      \ \"Skipping daily summary on $weekday\"\n  exit\nfi\n"
    - inv -e notify.failure-summary-send-notifications --daily-summary
    - "if [ \"$weekday\" = \"Friday\" ]; then\n  echo 'Sending weekly summary'\n  inv\
      \ -e notify.failure-summary-send-notifications --weekly-summary\nfi\n"
    stage: notify
    tags:
    - arch:amd64
    timeout: 15 minutes
notify_failure_summary_on_pipeline
  notify_failure_summary_on_pipeline:
    dependencies: []
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    resource_group: notification
    rules:
    - if: $CI_PIPELINE_SOURCE != "push" && $CI_PIPELINE_SOURCE != "schedule"
      when: never
    - if: $CI_COMMIT_BRANCH == "main"
      when: always
    script:
    - SLACK_API_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $SLACK_AGENT token)
      || exit $?; export SLACK_API_TOKEN
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN read_api)
      || exit $?; export GITLAB_TOKEN
    - DD_API_KEY=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_API_KEY_ORG2 token)
      || exit $?; export DD_API_KEY
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-notifications.txt
+     --break-system-packages
    - inv -e notify.failure-summary-upload-pipeline-data
    stage: notify
    tags:
    - arch:amd64
    timeout: 15 minutes
notify_github
  notify_github:
    allow_failure: true
    dependencies: []
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_arm64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - job: deploy_deb_testing-a7_x64
      optional: true
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $CI_COMMIT_BRANCH == "main"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $DEPLOY_AGENT == "false" && $DDR_WORKFLOW_ID == null && $RUN_E2E_TESTS ==
        "off"
      when: never
    - changes:
        compare_to: main
        paths:
        - '**/*.go'
      if: $RELEASE_VERSION_7 != ""
      when: on_success
    - when: never
    script:
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - messagefile="$(mktemp)"
    - echo "Use this command from [test-infra-definitions](https://github.com/DataDog/test-infra-definitions)
      to manually test this PR changes on a VM:" >> "$messagefile"
    - echo '```sh' >> "$messagefile"
    - echo "inv create-vm --pipeline-id=$CI_PIPELINE_ID --os-family=ubuntu" >> "$messagefile"
    - echo '```' >> "$messagefile"
    - 'echo "Note: This applies to commit **$CI_COMMIT_SHORT_SHA**" >> "$messagefile"'
    - inv -e github.pr-commenter --title "Test changes on VM" --body "$(cat "$messagefile")"
      --echo
    - rm "$messagefile"
    stage: notify
    tags:
    - arch:arm64
notify_gitlab_ci_changes
  notify_gitlab_ci_changes:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - compute_gitlab_ci_config
    rules:
    - changes:
        compare_to: main
        paths:
        - .gitlab-ci.yml
        - .gitlab/**/*.yml
    script:
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
    - inv -e notify.gitlab-ci-diff --from-diff artifacts/diff.gitlab-ci.yml --pr-comment
    stage: notify
    tags:
    - arch:amd64
slack_teams_channels_check
  slack_teams_channels_check:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs: []
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - python3 -m pip install codeowners -c tasks/libs/requirements-notifications.txt
+     --break-system-packages
    - inv -e notify.check-teams
    stage: source_test
    tags:
    - arch:amd64
test_gitlab_compare_to
  test_gitlab_compare_to:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - changes:
        compare_to: main
        paths:
        - .gitlab-ci.yml
        - .gitlab/**/*
        - .gitlab/**/.*
    script:
    - GITLAB_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $GITLAB_TOKEN write_api)
      || exit $?; export GITLAB_TOKEN
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
-   - pip install -r tasks/requirements.txt
+   - pip install -r tasks/requirements.txt --break-system-packages
    - inv pipeline.compare-to-itself
    stage: .pre
    tags:
    - arch:amd64
tests_macos
  tests_macos:
    after_script:
    - $CI_PROJECT_DIR/tools/ci/junit_upload.sh "junit-*-repacked.tgz"
    artifacts:
      expire_in: 2 weeks
      paths:
      - test_output.json
      - junit-*-repacked.tgz
      reports:
        junit:
        - '**/junit-out-*.xml'
      when: always
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - setup_agent_version
    rules:
    - if: $CI_COMMIT_BRANCH == "main"
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
    - if: $CI_COMMIT_BRANCH == "main" || $DEPLOY_AGENT == "true" || $RUN_ALL_BUILDS
        == "true" || $DDR_WORKFLOW_ID != null
    - if: $RUN_UNIT_TESTS == "on"
    - changes:
        compare_to: main
        paths:
        - pkg/fleet/**/*
      variables:
        FAST_TESTS: 'true'
    - allow_failure: true
      when: manual
    script:
    - "if [[ \"$(( RANDOM % 2 ))\" == \"1\" ]]; then\n  GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_1 key_b64) || exit $?; export GITHUB_KEY_B64\n  GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_1 app_id) || exit $?; export GITHUB_APP_ID\n  GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_1 installation_id) || exit $?; export GITHUB_INSTALLATION_ID\n\
      \  echo \"Using GitHub App instance 1\"\nelse\n  GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_2 key_b64) || exit $?; export GITHUB_KEY_B64\n  GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_2 app_id) || exit $?; export GITHUB_APP_ID\n  GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh\
      \ $MACOS_GITHUB_APP_2 installation_id) || exit $?; export GITHUB_INSTALLATION_ID\n\
      \  echo \"Using GitHub App instance 2\"\nfi\n"
    - $S3_CP_CMD $S3_ARTIFACTS_URI/agent-version.cache .
    - export VERSION_CACHE_CONTENT=$(cat agent-version.cache | base64 -)
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
    - FAST_TESTS_FLAG=""
    - if [[ "$FAST_TESTS" = "true" ]]; then FAST_TESTS_FLAG="--fast-tests true"; fi
    - inv -e github.trigger-macos --workflow-type "test" --datadog-agent-ref "$CI_COMMIT_SHA"
      --version-cache "$VERSION_CACHE_CONTENT" $FAST_TESTS_FLAG --test-washer $COVERAGE_CACHE_FLAG
    stage: source_test
    tags:
    - arch:amd64
    timeout: 6h
tests_macos_gitlab_amd64
  tests_macos_gitlab_amd64:
    after_script:
    - $CI_PROJECT_DIR/tools/ci/junit_upload.sh
    - CODECOV_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $CODECOV_TOKEN) || exit
      $?; export CODECOV_TOKEN
    - inv -e coverage.upload-to-codecov $COVERAGE_CACHE_FLAG || true
    allow_failure: true
    artifacts:
      expire_in: 2 weeks
      paths:
      - $TEST_OUTPUT_FILE
      - junit-*.tgz
      reports:
        annotations:
        - $EXTERNAL_LINKS_PATH
        junit:
        - '**/junit-out-*.xml'
      when: always
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - when: on_success
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e gitlab.generate-ci-visibility-links --output=$EXTERNAL_LINKS_PATH
    - FAST_TESTS_FLAG=""
    - if [[ "$FAST_TESTS" == "true" ]]; then FAST_TESTS_FLAG="--only-impacted-packages";
      fi
    - inv -e test --rerun-fails=2 --race --profile --cpus 12 --save-result-json $TEST_OUTPUT_FILE
      --junit-tar "junit-${CI_JOB_NAME}.tgz" $FAST_TESTS_FLAG --test-washer
    - inv -e invoke-unit-tests
    stage: source_test
    tags:
    - macos:monterey-amd64
    - specific:true
    variables:
      TEST_OUTPUT_FILE: test_output.json
tests_macos_gitlab_arm64
  tests_macos_gitlab_arm64:
    after_script:
    - $CI_PROJECT_DIR/tools/ci/junit_upload.sh
    - CODECOV_TOKEN=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $CODECOV_TOKEN) || exit
      $?; export CODECOV_TOKEN
    - inv -e coverage.upload-to-codecov $COVERAGE_CACHE_FLAG || true
    allow_failure: true
    artifacts:
      expire_in: 2 weeks
      paths:
      - $TEST_OUTPUT_FILE
      - junit-*.tgz
      reports:
        annotations:
        - $EXTERNAL_LINKS_PATH
        junit:
        - '**/junit-out-*.xml'
      when: always
    before_script:
    - 'eval $(gimme $(cat .go-version))
  
      export GOPATH=$GOROOT
  
      '
    - PYTHON_VERSION=$(python3 --version | awk '{print $2}')
    - VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
    - VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
    - echo "Using Python $PYTHON_VERSION..."
    - "# Check if the virtual environment directory exists\nif [ ! -d \"$VENV_PATH\"\
      \ ]; then\n  echo \"Creating virtual environment '$VENV_NAME'...\"\n  pyenv virtualenv\
      \ \"$PYTHON_VERSION\" \"$VENV_NAME\"\nelse\n  echo \"Virtual environment '$VENV_NAME'\
      \ already exists. Skipping creation.\"\nfi\n"
    - pyenv activate $VENV_NAME
    - 'echo "Don''t forget to regularly delete Go unused versions. Here are the installed
      Go versions and their disk space on the runner:"
  
      echo "Go:"
  
      du -sh $HOME/.gimme/versions/*
  
      echo "To remove a Go version please run:"
  
      echo "gimme uninstall <version>"
  
      '
    - 'echo "Don''t forget to regularly delete Python unused versions. Here are the
      installed Python versions and their disk space on the runner:"
  
      echo "Python:"
  
      du -sh $(pyenv root)/versions/*
  
      echo "To remove a Python version please run:"
  
      echo "pyenv uninstall -f <version>"
  
      '
    - python3 -m pip install -r requirements.txt -r tasks/libs/requirements-github.txt
+     --break-system-packages
    - pyenv rehash
    - inv -e rtloader.make
    - inv -e rtloader.install
    - inv -e install-tools
    needs:
    - go_deps
    - go_tools_deps
    rules:
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - allow_failure: true
      when: manual
    script:
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache.tar.xz
    - mkdir -p $GOPATH/pkg/mod/cache && tar xJf modcache_tools.tar.xz -C $GOPATH/pkg/mod/cache
    - rm -f modcache_tools.tar.xz
    - inv -e gitlab.generate-ci-visibility-links --output=$EXTERNAL_LINKS_PATH
    - FAST_TESTS_FLAG=""
    - if [[ "$FAST_TESTS" == "true" ]]; then FAST_TESTS_FLAG="--only-impacted-packages";
      fi
    - inv -e test --rerun-fails=2 --race --profile --cpus 12 --save-result-json $TEST_OUTPUT_FILE
      --junit-tar "junit-${CI_JOB_NAME}.tgz" $FAST_TESTS_FLAG --test-washer
    - inv -e invoke-unit-tests
    stage: source_test
    tags:
    - macos:monterey-arm64
    - specific:true
    variables:
      TEST_OUTPUT_FILE: test_output.json
unit_tests_notify
  unit_tests_notify:
    allow_failure: true
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - tests_deb-x64-py3
    - tests_deb-arm64-py3
    - tests_rpm-x64-py3
    - tests_rpm-arm64-py3
    - tests_windows-x64
    - tests_flavor_iot_deb-x64
    - tests_flavor_dogstatsd_deb-x64
    - tests_flavor_heroku_deb-x64
    rules:
    - if: $CI_COMMIT_BRANCH == "main"
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^[0-9]+\.[0-9]+\.x$/
      when: never
    - if: $CI_COMMIT_BRANCH =~ /^mq-working-branch-/
      when: never
    - if: $RUN_UNIT_TESTS == "off"
      when: never
    - when: always
    script:
-   - python3 -m pip install -r tasks/libs/requirements-github.txt
+   - python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
?                                                                 ++++++++++++++++++++++++
    - GITHUB_KEY_B64=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP key_b64)
      || exit $?; export GITHUB_KEY_B64
    - GITHUB_APP_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP app_id)
      || exit $?; export GITHUB_APP_ID
    - GITHUB_INSTALLATION_ID=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $AGENT_GITHUB_APP
      installation_id) || exit $?; export GITHUB_INSTALLATION_ID
    - echo "Using agent GitHub App"
    - inv notify.unit-tests --pipeline-id $CI_PIPELINE_ID --pipeline-url $CI_PIPELINE_URL
      --branch-name $CI_COMMIT_REF_NAME
    stage: source_test
    tags:
    - arch:amd64
update_rc_build_links
  update_rc_build_links:
    image: registry.ddbuild.io/ci/datadog-agent-buildimages/deb_x64$DATADOG_AGENT_BUILDIMAGES_SUFFIX:$DATADOG_AGENT_BUILDIMAGES
    needs:
    - artifacts: false
      job: docker_trigger_internal
    rules:
    - if: $RC_BUILD == "true"
    script:
    - ATLASSIAN_PASSWORD=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      token) || exit $?; export ATLASSIAN_PASSWORD
    - ATLASSIAN_USERNAME=$($CI_PROJECT_DIR/tools/ci/fetch_secret.sh $ATLASSIAN_WRITE
      user) || exit $?; export ATLASSIAN_USERNAME
-   - python3 -m pip install -r tasks/requirements_release_tasks.txt
+   - python3 -m pip install -r tasks/requirements_release_tasks.txt --break-system-packages
?                                                                   ++++++++++++++++++++++++
    - PATCH=$(echo "$CI_COMMIT_REF_NAME" | cut -d'.' -f3 | cut -c1)
    - if [[ "$PATCH" == "0" ]]; then PATCH_OPTION=""; else PATCH_OPTION="-p"; fi
    - inv -e release.update-build-links ${CI_COMMIT_REF_NAME} ${PATCH_OPTION}
    stage: post_rc_build
    tags:
    - arch:amd64
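
For readability, the YAML-escaped virtualenv bootstrap that the two macOS test jobs above run in their before_script expands to roughly the shell below. This is a sketch reassembled from the escaped strings in the config, not code copied verbatim from the repository; the `datadog-agent-python-<version>` venv naming comes straight from those strings.

```sh
# Derive the venv name from the interpreter version currently selected on the runner.
PYTHON_VERSION=$(python3 --version | awk '{print $2}')
VENV_NAME="datadog-agent-python-$PYTHON_VERSION"
VENV_PATH="$(pyenv root)/versions/$VENV_NAME"
echo "Using Python $PYTHON_VERSION..."

# Create the per-version virtualenv only if it does not exist yet, then activate it.
if [ ! -d "$VENV_PATH" ]; then
  echo "Creating virtual environment '$VENV_NAME'..."
  pyenv virtualenv "$PYTHON_VERSION" "$VENV_NAME"
else
  echo "Virtual environment '$VENV_NAME' already exists. Skipping creation."
fi
pyenv activate "$VENV_NAME"
```

Keying the venv on the exact interpreter version means a bump such as 3.12.6 gets a fresh environment on the runner instead of reusing packages built against the previous patch release.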

Changes Summary

| Removed | Modified | Added | Renamed |
|---------|----------|-------|---------|
| 0       | 28       | 0     | 0       |

ℹ️ Diff available in the job log.
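
The recurring edit in the diff above appends `--break-system-packages` to the various `python3 -m pip install` steps. That flag is pip's documented override for the PEP 668 "externally managed environment" protection: recent pip refuses to install into an interpreter owned by a system package manager unless the flag (or a virtualenv) is used, and the updated images presumably ship such a marked Python 3.12. A minimal illustration of the behavior, with the error text abbreviated and the exact wording varying by pip version:

```sh
# On a PEP 668-marked interpreter, a plain install into the system site-packages is refused:
python3 -m pip install -r tasks/libs/requirements-github.txt
#   error: externally-managed-environment

# Opting out explicitly, which is acceptable inside a disposable CI image:
python3 -m pip install -r tasks/libs/requirements-github.txt --break-system-packages
```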


cit-pr-commenter bot commented Oct 10, 2024

Regression Detector

Regression Detector Results

Metrics dashboard
Target profiles
Run ID: 7a36adc2-8ee2-4ad5-b445-2ff0349ff340

Baseline: 80ece75
Comparison: 4e504cc
Diff

Optimization Goals: ❌ Significant changes detected

| perf | experiment | goal | Δ mean % | Δ mean % CI | trials | links |
|------|------------|------|----------|-------------|--------|-------|
|      | pycheck_lots_of_tags | % cpu utilization | -6.83 | [-10.08, -3.57] | 1 | Logs |

Fine details of change detection per experiment

| perf | experiment | goal | Δ mean % | Δ mean % CI | trials | links |
|------|------------|------|----------|-------------|--------|-------|
|      | basic_py_check | % cpu utilization | +2.99 | [-0.85, +6.83] | 1 | Logs |
|      | tcp_syslog_to_blackhole | ingress throughput | +0.04 | [-0.01, +0.10] | 1 | Logs |
|      | file_to_blackhole_0ms_latency | egress throughput | +0.00 | [-0.42, +0.42] | 1 | Logs |
|      | tcp_dd_logs_filter_exclude | ingress throughput | -0.00 | [-0.01, +0.01] | 1 | Logs |
|      | file_to_blackhole_100ms_latency | egress throughput | -0.00 | [-0.26, +0.25] | 1 | Logs |
|      | uds_dogstatsd_to_api | ingress throughput | -0.02 | [-0.09, +0.06] | 1 | Logs |
|      | file_to_blackhole_300ms_latency | egress throughput | -0.04 | [-0.23, +0.15] | 1 | Logs |
|      | file_to_blackhole_1000ms_latency | egress throughput | -0.08 | [-0.56, +0.41] | 1 | Logs |
|      | file_to_blackhole_500ms_latency | egress throughput | -0.09 | [-0.33, +0.16] | 1 | Logs |
|      | quality_gate_idle | memory utilization | -0.41 | [-0.46, -0.36] | 1 | Logs bounds checks dashboard |
|      | uds_dogstatsd_to_api_cpu | % cpu utilization | -0.45 | [-1.17, +0.28] | 1 | Logs |
|      | quality_gate_idle_all_features | memory utilization | -0.53 | [-0.66, -0.41] | 1 | Logs bounds checks dashboard |
|      | idle_all_features | memory utilization | -0.59 | [-0.69, -0.49] | 1 | Logs bounds checks dashboard |
|      | idle | memory utilization | -0.78 | [-0.83, -0.74] | 1 | Logs bounds checks dashboard |
|      | file_tree | memory utilization | -1.21 | [-1.35, -1.06] | 1 | Logs |
|      | pycheck_lots_of_tags | % cpu utilization | -6.83 | [-10.08, -3.57] | 1 | Logs |

Bounds Checks: ❌ Failed

| perf | experiment | bounds_check_name | replicates_passed | links |
|------|------------|-------------------|-------------------|-------|
|      | file_to_blackhole_0ms_latency | lost_bytes | 6/10 | |
|      | file_to_blackhole_100ms_latency | lost_bytes | 7/10 | |
|      | quality_gate_idle | memory_usage | 7/10 | bounds checks dashboard |
|      | idle | memory_usage | 9/10 | bounds checks dashboard |
|      | file_to_blackhole_0ms_latency | memory_usage | 10/10 | |
|      | file_to_blackhole_1000ms_latency | memory_usage | 10/10 | |
|      | file_to_blackhole_100ms_latency | memory_usage | 10/10 | |
|      | file_to_blackhole_300ms_latency | memory_usage | 10/10 | |
|      | file_to_blackhole_500ms_latency | memory_usage | 10/10 | |
|      | idle_all_features | memory_usage | 10/10 | bounds checks dashboard |
|      | quality_gate_idle_all_features | memory_usage | 10/10 | bounds checks dashboard |

Explanation

Confidence level: 90.00%
Effect size tolerance: |Δ mean %| ≥ 5.00%

Performance changes are noted in the perf column of each table:

  • ✅ = significantly better comparison variant performance
  • ❌ = significantly worse comparison variant performance
  • ➖ = no significant change in performance

A regression test is an A/B test of target performance in a repeatable rig, where "performance" is measured as "comparison variant minus baseline variant" for an optimization goal (e.g., ingress throughput). Due to intrinsic variability in measuring that goal, we can only estimate its mean value for each experiment; we report uncertainty in that value as a 90.00% confidence interval denoted "Δ mean % CI".

For each experiment, we decide whether a change in performance is a "regression" -- a change worth investigating further -- if all of the following criteria are true (a minimal sketch of this check follows the list):

  1. Its estimated |Δ mean %| ≥ 5.00%, indicating the change is big enough to merit a closer look.

  2. Its 90.00% confidence interval "Δ mean % CI" does not contain zero, indicating that if our statistical model is accurate, there is at least a 90.00% chance there is a difference in performance between baseline and comparison variants.

  3. Its configuration does not mark it "erratic".
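
As a minimal sketch, the first two criteria can be checked mechanically from the table columns; the "erratic" flag of criterion 3 lives in the experiment configuration and is omitted here. The helper below is illustrative only, not the detector's actual code, and hard-codes the 5.00% tolerance quoted above:

```sh
# Usage: is_regression <delta_mean_pct> <ci_low> <ci_high>
is_regression() {
  awk -v d="$1" -v lo="$2" -v hi="$3" 'BEGIN {
    abs_d = (d < 0) ? -d : d
    big_enough   = (abs_d >= 5.0)          # criterion 1: |delta mean %| >= 5.00%
    ci_excludes0 = !(lo <= 0 && 0 <= hi)   # criterion 2: 90% CI does not contain zero
    verdict = (big_enough && ci_excludes0) ? "worth investigating" : "no significant change"
    print verdict
  }'
}

is_regression -6.83 -10.08 -3.57   # pycheck_lots_of_tags row -> "worth investigating"
is_regression  2.99  -0.85  6.83   # basic_py_check row       -> "no significant change"
```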

@Kyle-Neale Kyle-Neale force-pushed the kyle.neale/test-build-image-updates branch from 0fde749 to e6f93fe Compare October 11, 2024 18:57
@Kyle-Neale Kyle-Neale force-pushed the kyle.neale/test-build-image-updates branch from 7a5f48f to 70d865b Compare October 28, 2024 19:23
@github-actions github-actions bot added the short review PR is simple enough to be reviewed quickly label Oct 29, 2024
@Kyle-Neale Kyle-Neale changed the title Test build image python 3.12.6 updates Bump build images for python 3.12.6 updates Nov 7, 2024
@Kyle-Neale Kyle-Neale marked this pull request as ready for review November 7, 2024 20:55
@Kyle-Neale Kyle-Neale requested a review from a team as a code owner November 7, 2024 20:55
Contributor

@CelianR CelianR left a comment

Okay, if #30332 is merged here 👍

@steveny91 steveny91 requested review from a team as code owners November 8, 2024 16:10
@github-actions github-actions bot added long review PR is complex, plan time to review it and removed short review PR is simple enough to be reviewed quickly labels Nov 8, 2024
@Kyle-Neale
Contributor Author

/merge

@dd-devflow

dd-devflow bot commented Nov 8, 2024

Devflow running: /merge

View all feedbacks in Devflow UI.


2024-11-08 18:27:07 UTC ℹ️ MergeQueue: pull request added to the queue

The median merge time in main is 24m.

@dd-mergequeue dd-mergequeue bot merged commit 40572e8 into main Nov 8, 2024
297 checks passed
@dd-mergequeue dd-mergequeue bot deleted the kyle.neale/test-build-image-updates branch November 8, 2024 19:04
@github-actions github-actions bot added this to the 7.61.0 milestone Nov 8, 2024
@pducolin pducolin added the qa/done QA done before merge and regressions are covered by tests label Dec 3, 2024