Running benchmarks with the auto builders

This section describes how to run benchmarks using the auto builders.

Quick start

  • Make a merge request
  • Wait for cbuild to take the snapshot (up to 60 minutes)
  • Record the version from the merge request. Something like gcc-linaro-4.7+bzr114975~uweigand~vecsetextractmem-4.7

  • Wait for the build to finish
  • Go to http://cbuild.validation.linaro.org/helpers/scheduler/spawn

  • Enter benchmarks-gcc-$version or benchmarks-spec2000-gcc-$version

  • Tick 'a9hf-ref'
  • Click Submit
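
The job names entered in the spawn form follow a simple pattern. A small sketch, using the example version above:

```shell
# Version string recorded from the merge request (example from above)
version="gcc-linaro-4.7+bzr114975~uweigand~vecsetextractmem-4.7"

# Job names to enter in the spawn form (queue: a9hf-ref)
echo "benchmarks-gcc-$version"
echo "benchmarks-spec2000-gcc-$version"
```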

Spawn a job

http://cbuild.validation.linaro.org/helpers/scheduler/spawn

Merge requests are built automatically. Otherwise, drop an arbitrary tarball into cbuild@toolchain64.lab:~cbuild/var/snapshots, then run cd ~/lib/cbuild-tools; ./spawn path-to-tarball.

For example:

scp gcc-4.6.3.tar.gz cbuild@toolchain64.lab:~/var/snapshots
cd ~/lib/cbuild-tools; ./spawn gcc-4.6.3   # spawns into the a9-builders queue

After a job has been started, it shows up in the scheduler: http://cbuild.validation.linaro.org/helpers/scheduler

Jobs

  • gcc-$version - build and test GCC
  • benchmarks-gcc-$version - run coremark, denbench and eembc against the already-built version
  • benchmarks-spec2000-gcc-$version - run spec2000

Queues

  • a9-builders: anything that can natively build an A9 compiler
  • a9hf-ref: reference A9 hard float boards
  • a9-ref: reference A9 softfp boards
  • a8-ref: reference A8 boards

Results

Results end up under http://cbuild.validation.linaro.org/benchmarks/. Go to gcc-$version/logs/arm* for the actual results. See the internal page for the username and password.

benchmarks.txt is regenerated automatically whenever new results arrive.

To compare two runs, check out lp:linaro-toolchain-benchmarks and run:

scripts/diffbench.py \
  http://cbuild.validation.linaro.org/benchmarks/baseline-version/logs/arm*/benchmarks.txt \
  http://cbuild.validation.linaro.org/benchmarks/your-version/logs/arm*/benchmarks.txt

See the README in linaro-toolchain-benchmarks for more.

Variables

  • BENCHMARKS = list, such as coremark spec2000 pybench - run these benchmarks instead of the defaults
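
BENCHMARKS can also be passed when running the cbuild makefiles by hand (see the manual instructions below). A sketch, with a hypothetical snapshot name:

```shell
# Run only coremark and pybench against an already-built snapshot
# (snapshot name hypothetical; run from a slaves/$(hostname) directory)
make -f ../../lib/build.mk gcc-linaro-4.6-2011.10/benchmarks.stamp \
    BENCHMARKS="coremark pybench"
```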

Running benchmarks manually

This section describes how to run benchmarks manually using the cbuild scripts. (Some of this is specific to running on the toolchain validation boards, but most of the instructions apply to the general case.)

Log in to a board

ssh ursa2.ex
# longer version...
ssh -p 7022 asa-san@cbuild.validation.linaro.org
# screen makes sure your work continues even if the connection is lost
screen

Set up a few things. (One time stuff.)

apt-get install ccrypt bzr time ...
mkdir -p ~/.config/cbuild
echo secretPassword > ~/.config/cbuild/password

# preferably create your build folder on /scratch
mkdir bench; cd bench
bzr branch lp:cbuild

# put the test files in the cbuild folder
tar -xvf ../../files.tar

# Create a slave directory for this host
mkdir -p slaves/$(hostname); cd slaves/$(hostname)

# Create a local.mk and add some basic stuff. Add more configuration later if needed
echo "TOPDIR=../.." > local.mk
echo "CONFIG=cortexa9" >> local.mk
echo "PUBLISH_URL=some-local-publish-dir" >> local.mk

Commands for running the benchmarks

cd cbuild/slaves/$(hostname)

# EEMBC with default toolchain and configuration
make -f ../../lib/eembc.mk

# EEMBC with a specific toolchain
make -f ../../lib/build.mk gcc-linaro-4.6-2011.10/benchmarks.stamp BENCHMARKS=eembc

# SPEC2000 with default configuration, executing a set of tests with full workload
make -f ../../lib/spec2000.mk  TESTS="int 177 179 183 188" WORKLOAD=ref

# Use the gcc in the system for coremark, default variants.
make -f ../../lib/build.mk gcc-native/benchmarks.stamp BENCHMARKS=coremark;

Variants

  • For running more than one variant in a batch, create a file containing the variants you like and set the DEFAULT_VARIANTS variable to point to that file.

cp ../../lib/mlh1-variants.mk ../../lib/asa-variants.mk
# choose variants
vi ../../lib/asa-variants.mk
echo "DEFAULT_VARIANTS=asa-variants" >> local.mk
  • The variants in DEFAULT_VARIANTS are picked up automatically by build.mk.

Results

# Install dependencies if needed
sudo apt-get install python-numpy
# The branch with processing scripts
bzr branch lp:linaro-toolchain-benchmarks

# Get the results from the board
scp -r asa-san@ursa4.ex:/home/asa-san/bench/cbuild/slaves/ursa4/some-local-publish-dir results

# Tabulate
cd results
find . -name '*run*' -execdir cp '{}' ../.. ';'
python ~/linaro/benchs/linaro-toolchain-benchmarks/scripts/tabulate.py gcc*/*run.txt > tmp.csv
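
The gather step above can be exercised on a toy tree to see what it does (paths and file names hypothetical):

```shell
# Build a toy results tree with one run file (names hypothetical)
mkdir -p results/gcc-4.6.3/logs
echo "coremark: 1000" > results/gcc-4.6.3/logs/coremark-run.txt

# Gather every *run* file into the current directory
find results -name '*run*' -exec cp '{}' . ';'

cat coremark-run.txt
```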

4.4.x compilers

  • To avoid build errors in the C++ tests when using 4.4.x compilers, remove LD_LIBRARY_PATH from build.mk (the gcc-%/benchmarks.stamp target) and run only the C tests.

WorkingGroups/ToolChain/Benchmarks/RunningBenchmarksWithCbuild (last modified 2012-12-04 23:01:58)