[dpdk-stable] [dpdk-dev] [PATCH v1 1/1] ci: enable DPDK GHA for arm64 with self-hosted runners

Michael Santana msantana at redhat.com
Wed Oct 13 13:32:11 CEST 2021



On 10/13/21 4:03 AM, Serena He wrote:
> CI jobs are triggered only for repos installed with given GHApp and runners
> 
> Cc: stable at dpdk.org
> 
> Signed-off-by: Serena He <serena.he at arm.com>
> 
> ---
>   .github/workflows/build-arm64.yml | 118 ++++++++++++++++++++++++++++++
>   1 file changed, 118 insertions(+)
>   create mode 100644 .github/workflows/build-arm64.yml
> 
> diff --git a/.github/workflows/build-arm64.yml b/.github/workflows/build-arm64.yml
> new file mode 100644
> index 0000000000..570563f7c8
> --- /dev/null
> +++ b/.github/workflows/build-arm64.yml
Adding a new workflow should work on our 0-day-bot. We now support 
having multiple workflows, so this looks good.
> @@ -0,0 +1,118 @@
> +name: build-arm64
> +
> +on:
> +  push:
> +  schedule:
> +    - cron: '0 0 * * 1'
nit: Please add a comment for when this is scheduled so we don't have 
to do cron math :)
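Something like this, for example (comment wording is just a suggestion):

```yaml
  schedule:
    # 00:00 UTC every Monday
    - cron: '0 0 * * 1'
```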
> +
> +defaults:
> +  run:
> +    shell: bash --noprofile --norc -exo pipefail {0}
> +
> +jobs:
> +  build:
> +    # Here, runners for arm64 are accessed by installed GitHub APP, thus will not be available by fork.
> +    # you can change the following 'if' and 'runs-on' if you have your own runners installed.
> +    # or request to get your repo on the whitelist to use GitHub APP and delete this 'if'.
I think I understand. I think you mean s/GitHub APP/GitHub/; otherwise 
I don't know what that is. From my understanding you had to request 
special arm-based runners from GitHub.

Are DPDK/dpdk and ovsrobot/dpdk whitelisted to use the arm-based runners?

Maybe there was a thread about this in the past that I missed, but where 
and how do you get these arm-based runners from GitHub?
> +    if: ${{ github.repository == 'DPDK/dpdk' || github.repository == 'ovsrobot/dpdk' }}
> +    name: ${{ join(matrix.config.*, '-') }}
> +    runs-on: ${{ matrix.config.os }}
> +    env:
> +      ABI_CHECKS: ${{ contains(matrix.config.checks, 'abi') }}
> +      BUILD_DOCS: ${{ contains(matrix.config.checks, 'doc') }}
> +      CL: ${{ matrix.config.compiler == 'clang' }}
> +      CC: ccache ${{ matrix.config.compiler }}
> +      DEF_LIB: ${{ matrix.config.library }}
> +      LIBABIGAIL_VERSION: libabigail-1.8
> +      REF_GIT_TAG: none
> +
> +    strategy:
> +      fail-fast: false
> +      matrix:
> +        config:
> +          - os: [self-hosted,arm-ubuntu-20.04]
> +            compiler: gcc
> +            library: static
> +          - os: [self-hosted,arm-ubuntu-20.04]
> +            compiler: gcc
> +            library: shared
> +            checks: doc+tests
> +          - os: [self-hosted,arm-ubuntu-20.04]
> +            compiler: clang
> +            library: static
> +          - os: [self-hosted,arm-ubuntu-20.04]
> +            compiler: clang
> +            library: shared
> +            checks: doc+tests
> +
> +    steps:
> +    - name: Checkout sources
> +      uses: actions/checkout@v2
> +    - name: Generate cache keys
> +      id: get_ref_keys
> +      run: |
> +        echo -n '::set-output name=ccache::'
> +        echo 'ccache-${{ matrix.config.os }}-${{ matrix.config.compiler }}-${{ matrix.config.cross }}-'$(date -u +%Y-w%W)
> +        echo -n '::set-output name=libabigail::'
> +        echo 'libabigail-${{ matrix.config.os }}'
> +        echo -n '::set-output name=abi::'
> +        echo 'abi-${{ matrix.config.os }}-${{ matrix.config.compiler }}-${{ matrix.config.cross }}-${{ env.LIBABIGAIL_VERSION }}-${{ env.REF_GIT_TAG }}'
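For anyone following along: the date suffix means the ccache key rotates 
weekly, so caches do not grow stale. With hypothetical matrix values (not 
taken from an actual run), the key expands like this:

```shell
# Weekly-rotating ccache key, mirroring the expression above.
# "arm-ubuntu-20.04" and "gcc" are example matrix values; the empty
# field is matrix.config.cross, which is unset in this workflow.
suffix=$(date -u +%Y-w%W)   # year + week number, e.g. 2021-w41
echo "ccache-arm-ubuntu-20.04-gcc--$suffix"
```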
> +    - name: Retrieve ccache cache
> +      uses: actions/cache@v2
> +      with:
> +        path: ~/.ccache
> +        key: ${{ steps.get_ref_keys.outputs.ccache }}-${{ github.ref }}
> +        restore-keys: |
> +          ${{ steps.get_ref_keys.outputs.ccache }}-refs/heads/main
> +    - name: Retrieve libabigail cache
> +      id: libabigail-cache
> +      uses: actions/cache@v2
> +      if: env.ABI_CHECKS == 'true'
> +      with:
> +        path: libabigail
> +        key: ${{ steps.get_ref_keys.outputs.libabigail }}
> +    - name: Retrieve ABI reference cache
> +      uses: actions/cache@v2
> +      if: env.ABI_CHECKS == 'true'
> +      with:
> +        path: reference
> +        key: ${{ steps.get_ref_keys.outputs.abi }}
> +    - name: Update APT cache
> +      run: sudo apt update || true
> +    - name: Install packages
> +      run: sudo apt install -y ccache libnuma-dev python3-setuptools
> +        python3-wheel python3-pip python3-pyelftools ninja-build libbsd-dev
> +        libpcap-dev libibverbs-dev libcrypto++-dev libfdt-dev libjansson-dev
> +        libarchive-dev zlib1g-dev pkgconf
> +    - name: Install libabigail build dependencies if no cache is available
> +      if: env.ABI_CHECKS == 'true' && steps.libabigail-cache.outputs.cache-hit != 'true'
> +      run: sudo apt install -y autoconf automake libtool pkg-config libxml2-dev
> +          libdw-dev
Lots of caching stuff. Is all of it needed?
> +
> +    - name: Install test tools packages
> +      run: sudo apt install -y gdb
> +    - name: Install doc generation packages
> +      if: env.BUILD_DOCS == 'true'
> +      run: sudo apt install -y doxygen graphviz python3-sphinx
> +        python3-sphinx-rtd-theme
> +    - name: Run setup
> +      run: |
> +        .ci/linux-setup.sh
> +        # Workaround on $HOME permissions as EAL checks them for plugin loading
> +        chmod o-w $HOME
> +    - name: Install clang
> +      if: env.CL == 'true'
> +      run: sudo apt install -y clang
> +    - name: Build and test
> +      run: .ci/linux-build.sh
> +    - name: Upload logs on failure
> +      if: failure()
> +      uses: actions/upload-artifact@v2
> +      with:
> +        name: meson-logs-${{ join(matrix.config.*, '-') }}
> +        path: |
> +          build/meson-logs/testlog.txt
> +          build/.ninja_log
> +          build/meson-logs/meson-log.txt
> +          build/gdb.log
LGTM!
> +
> 


