===============================================================================
About this build: this rebuild has been done as part of reproduce.debian.net
where we aim to reproduce Debian binary packages distributed via
ftp.debian.org, by rebuilding using the exact same packages as the original
build on the buildds, as described in the relevant .buildinfo file from
buildinfos.debian.net. For more information please go to
https://reproduce.debian.net or join #debian-reproducible on irc.debian.org
===============================================================================
Preparing download of sources for /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1_riscv64.buildinfo
Source: slepc
Version: 3.24.1+dfsg1-1
rebuilderd-worker node: riscv64-34
+------------------------------------------------------------------------------+
| Downloading sources                          Mon, 05 Jan 2026 10:22:04 +0000 |
+------------------------------------------------------------------------------+
Get:1 https://deb.debian.org/debian trixie InRelease [140 kB]
Get:2 https://deb.debian.org/debian-security trixie-security InRelease [43.4 kB]
Get:3 https://deb.debian.org/debian trixie-updates InRelease [47.3 kB]
Get:4 https://deb.debian.org/debian trixie-proposed-updates InRelease [57.6 kB]
Get:5 https://deb.debian.org/debian trixie-backports InRelease [54.0 kB]
Get:6 https://deb.debian.org/debian forky InRelease [137 kB]
Get:7 https://deb.debian.org/debian sid InRelease [187 kB]
Get:8 https://deb.debian.org/debian experimental InRelease [91.1 kB]
Get:9 https://deb.debian.org/debian trixie/non-free-firmware Sources [6,548 B]
Get:10 https://deb.debian.org/debian trixie/main Sources [10.5 MB]
Get:11 https://deb.debian.org/debian-security trixie-security/main Sources [117 kB]
Get:12 https://deb.debian.org/debian-security trixie-security/non-free-firmware Sources [696 B]
Get:13 https://deb.debian.org/debian trixie-updates/main Sources [2,788 B]
Get:14 https://deb.debian.org/debian trixie-proposed-updates/non-free-firmware Sources [692 B]
Get:15 https://deb.debian.org/debian trixie-proposed-updates/main Sources [182 kB]
Get:16 https://deb.debian.org/debian trixie-backports/main Sources [122 kB]
Get:17 https://deb.debian.org/debian trixie-backports/non-free-firmware Sources [2,468 B]
Get:18 https://deb.debian.org/debian forky/non-free-firmware Sources [7,696 B]
Get:19 https://deb.debian.org/debian forky/main Sources [10.6 MB]
Get:20 https://deb.debian.org/debian sid/main Sources [11.2 MB]
Get:21 https://deb.debian.org/debian sid/non-free-firmware Sources [9,692 B]
Get:22 https://deb.debian.org/debian experimental/non-free-firmware Sources [3,180 B]
Get:23 https://deb.debian.org/debian experimental/main Sources [355 kB]
Fetched 34.0 MB in 11s (3,041 kB/s)
Reading package lists...
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.1%2bdfsg1-1.dsc' slepc_3.24.1+dfsg1-1.dsc 3454 SHA256:73a654f071293b1bc48ff88503ab8d1be24c9651530a2a8d2998629d48455022
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.1%2bdfsg1.orig.tar.xz' slepc_3.24.1+dfsg1.orig.tar.xz 23597012 SHA256:ba48811ad927c83ec7ce9eb52c4fcfd97bb055ec5b269ac1b452a677f67d1a0a
'https://deb.debian.org/debian/pool/main/s/slepc/slepc_3.24.1%2bdfsg1-1.debian.tar.xz' slepc_3.24.1+dfsg1-1.debian.tar.xz 21360 SHA256:d169298a7f30dc194d764e4a0ae11913bd1472a65df77f80f3cf0599cc067b64
ba48811ad927c83ec7ce9eb52c4fcfd97bb055ec5b269ac1b452a677f67d1a0a slepc_3.24.1+dfsg1.orig.tar.xz
d169298a7f30dc194d764e4a0ae11913bd1472a65df77f80f3cf0599cc067b64 slepc_3.24.1+dfsg1-1.debian.tar.xz
73a654f071293b1bc48ff88503ab8d1be24c9651530a2a8d2998629d48455022 slepc_3.24.1+dfsg1-1.dsc
+------------------------------------------------------------------------------+
| Calling debrebuild                           Mon, 05 Jan 2026 10:22:19 +0000 |
+------------------------------------------------------------------------------+
Rebuilding slepc=3.24.1+dfsg1-1 in /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs now.
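The SHA256 lines in the log above follow the standard coreutils checksum-list format, which is how the downloaded source artifacts can be verified against the values recorded in the .dsc. A minimal sketch of that verification step (the file name below is a stand-in, not one of the actual slepc artifacts):

```shell
# Sketch only: hash a stand-in file, record the expected checksum in the
# coreutils list format, then verify it the way the rebuild tooling does.
echo "demo contents" > artifact.tar.xz   # stand-in for a real source tarball
sha256sum artifact.tar.xz > SHA256SUMS   # record the expected checksum
sha256sum -c SHA256SUMS                  # prints "artifact.tar.xz: OK" on match
```

`sha256sum -c` exits non-zero if any listed file is missing or its contents differ, so a tampered or truncated download fails the rebuild before any compilation starts.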
+ nice /usr/bin/debrebuild --buildresult=/srv/rebuilderd/tmp/rebuilderd1Mmvtg/out --builder=sbuild+unshare --cache=/srv/rebuilderd/cache -- /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1_riscv64.buildinfo
/srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1_riscv64.buildinfo contains a GPG signature which has NOT been validated
Using defined Build-Path: /build/reproducible-path/slepc-3.24.1+dfsg1
I: verifying dsc... successful!
Get:1 http://deb.debian.org/debian unstable InRelease [187 kB]
Get:2 http://snapshot.debian.org/archive/debian/20251120T202450Z sid InRelease [176 kB]
Get:3 http://snapshot.debian.org/archive/debian/20251124T032852Z sid InRelease [176 kB]
Get:4 http://deb.debian.org/debian unstable/main riscv64 Packages [9821 kB]
Get:5 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 Packages [9833 kB]
Get:6 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 Packages [9832 kB]
Fetched 30.0 MB in 9s (3523 kB/s)
Reading package lists...
W: http://snapshot.debian.org/archive/debian/20251120T202450Z/dists/sid/InRelease: Loading /etc/apt/trusted.gpg from deprecated option Dir::Etc::Trusted
W: http://snapshot.debian.org/archive/debian/20251124T032852Z/dists/sid/InRelease: Loading /etc/apt/trusted.gpg from deprecated option Dir::Etc::Trusted
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libitm1 riscv64 15.2.0-8 [25.4 kB]
Fetched 25.4 kB in 0s (403 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg_ncvrnz/libitm1_15.2.0-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgnutls-openssl27t64 riscv64 3.8.10-3 [456 kB]
Fetched 456 kB in 0s (7258 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9o97lqtm/libgnutls-openssl27t64_3.8.10-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmagic-mgc riscv64 1:5.46-5 [338 kB]
Fetched 338 kB in 0s (12.1 MB/s)
dpkg-name: info: moved 'libmagic-mgc_1%3a5.46-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpx9mls_ao/libmagic-mgc_5.46-5_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgfortran-15-dev riscv64 15.2.0-8 [1311 kB]
Fetched 1311 kB in 0s (16.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy4u3lbl0/libgfortran-15-dev_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libxcb1 riscv64 1.17.0-2+b1 [145 kB]
Fetched 145 kB in 0s (2489 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmuo4yzyv/libxcb1_1.17.0-2+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 media-types all 14.0.0 [30.8 kB]
Fetched 30.8 kB in 0s (1476 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxmxfiyck/media-types_14.0.0_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 zlib1g-dev riscv64 1:1.3.dfsg+really1.3.1-1+b1 [978 kB]
Fetched 978 kB in 0s (23.6 MB/s)
dpkg-name: info: moved 'zlib1g-dev_1%3a1.3.dfsg+really1.3.1-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp_lp3hs_0/zlib1g-dev_1.3.dfsg+really1.3.1-1+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 xtrans-dev all 1.6.0-1 [93.5 kB]
Fetched 93.5 kB in 0s (4225 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpns_30go1/xtrans-dev_1.6.0-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-dev riscv64 2:1.8.12-1 [1408 kB]
Fetched 1408 kB in 0s (27.9 MB/s)
dpkg-name: info: moved 'libx11-dev_2%3a1.8.12-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpmr57e_kl/libx11-dev_1.8.12-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libbsd0 riscv64 0.12.2-2 [132 kB]
Fetched 132 kB in 0s (5655 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw930k2qm/libbsd0_0.12.2-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-dev riscv64 5.8.1-2 [7361 kB]
Fetched 7361 kB in 0s (35.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpe2u16mrx/libmumps-dev_5.8.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libnuma-dev riscv64 2.0.19-1 [66.2 kB]
Fetched 66.2 kB in 0s (1216 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphrxlpxg8/libnuma-dev_2.0.19-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libibverbs-dev riscv64 56.1-1+b1 [1805 kB]
Fetched 1805 kB in 0s (19.8 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt_7tofn_/libibverbs-dev_56.1-1+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-pthreads-2.1-7t64 riscv64 2.1.12-stable-10+b1 [54.1 kB]
Fetched 54.1 kB in 0s (986 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpznsxnpyc/libevent-pthreads-2.1-7t64_2.1.12-stable-10+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libcrypt-dev riscv64 1:4.5.1-1 [262 kB]
Fetched 262 kB in 0s (10.0 MB/s)
dpkg-name: info: moved 'libcrypt-dev_1%3a4.5.1-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmppfxjaopo/libcrypt-dev_4.5.1-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libjansson4 riscv64 2.14-2+b3 [40.7 kB]
Fetched 40.7 kB in 0s (1925 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5w3hb5ba/libjansson4_2.14-2+b3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 sysvinit-utils riscv64 3.15-6 [34.6 kB]
Fetched 34.6 kB in 0s (1657 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl0vh6oyt/sysvinit-utils_3.15-6_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 perl riscv64 5.40.1-7 [267 kB]
Fetched 267 kB in 0s (10.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp33wmq73n/perl_5.40.1-7_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libxml2-16 riscv64 2.15.1+dfsg-0.4 [637 kB]
Fetched 637 kB in 0s (9526 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8i6p8s7y/libxml2-16_2.15.1+dfsg-0.4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcap2 riscv64 1:2.75-10+b1 [29.3 kB]
Fetched 29.3 kB in 0s (1402 kB/s)
dpkg-name: info: moved 'libcap2_1%3a2.75-10+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpwnhtvxkt/libcap2_2.75-10+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libmount1 riscv64 2.41.2-4 [226 kB]
Fetched 226 kB in 0s (8935 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7x_1nuq9/libmount1_2.41.2-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 dwz riscv64 0.16-2 [115 kB]
Fetched 115 kB in 0s (5022 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4dky9k_d/dwz_0.16-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas-dev riscv64 3.12.1-7 [291 kB]
Fetched 291 kB in 0s (6250 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmu1bof_b/libblas-dev_3.12.1-7_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libsasl2-modules-db riscv64 2.1.28+dfsg1-10 [20.2 kB]
Fetched 20.2 kB in 0s (969 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppb1djbmq/libsasl2-modules-db_2.1.28+dfsg1-10_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libexpat1 riscv64 2.7.3-1 [106 kB]
Fetched 106 kB in 0s (4692 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5b0h26vj/libexpat1_2.7.3-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhogweed6t64 riscv64 3.10.2-1 [336 kB]
Fetched 336 kB in 0s (12.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptfdqjlg4/libhogweed6t64_3.10.2-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-openssl-2.1-7t64 riscv64 2.1.12-stable-10+b1 [60.6 kB]
Fetched 60.6 kB in 0s (1110 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxz8hzhi1/libevent-openssl-2.1-7t64_2.1.12-stable-10+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libtool all 2.5.4-7 [540 kB]
Fetched 540 kB in 0s (15.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpb_yic8f8/libtool_2.5.4-7_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libstdc++6 riscv64 15.2.0-8 [714 kB]
Fetched 714 kB in 0s (10.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw2oxoh1w/libstdc++6_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 gettext riscv64 0.23.2-1 [1684 kB]
Fetched 1684 kB in 0s (18.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpodk6viys/gettext_0.23.2-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libldl3 riscv64 1:7.11.0+dfsg-2 [33.8 kB]
Fetched 33.8 kB in 0s (626 kB/s)
dpkg-name: info: moved 'libldl3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp87ueonjh/libldl3_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-double3 riscv64 3.3.10-2+b1 [378 kB]
Fetched 378 kB in 0s (6085 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7kg1tx_a/libfftw3-double3_3.3.10-2+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libtinfo6 riscv64 6.5+20251115-2 [351 kB]
Fetched 351 kB in 0s (5788 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpz654nequ/libtinfo6_6.5+20251115-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libelf1t64 riscv64 0.194-1 [190 kB]
Fetched 190 kB in 1s (345 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpm2s7eoqs/libelf1t64_0.194-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libp11-kit-dev riscv64 0.25.10-1 [221 kB]
Fetched 221 kB in 0s (3792 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf25l8sas/libp11-kit-dev_0.25.10-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre-2.33.0 riscv64 2.33.0-3 [1536 kB]
Fetched 1536 kB in 0s (18.0 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbf0t_85x/libhypre-2.33.0_2.33.0-3_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcolamd3 riscv64 1:7.11.0+dfsg-2 [41.0 kB]
Fetched 41.0 kB in 0s (744 kB/s)
dpkg-name: info: moved 'libcolamd3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp9qfl6f2s/libcolamd3_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libacl1 riscv64 2.3.2-2+b1 [32.9 kB]
Fetched 32.9 kB in 0s (1581 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpie2dhku9/libacl1_2.3.2-2+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libsuperlu-dist9 riscv64 9.2.0+dfsg1-4 [705 kB]
Fetched 705 kB in 0s (8887 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn__cslyq/libsuperlu-dist9_9.2.0+dfsg1-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu7 riscv64 7.0.1+dfsg1-2 [159 kB]
Fetched 159 kB in 0s (2797 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsscuvju9/libsuperlu7_7.0.1+dfsg1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-modules-bin riscv64 1.7.0-5 [49.3 kB]
Fetched 49.3 kB in 0s (2328 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy156s7wh/libpam-modules-bin_1.7.0-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libcbor0.10 riscv64 0.10.2-2.1 [27.8 kB]
Fetched 27.8 kB in 0s (524 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8up72yjx/libcbor0.10_0.10.2-2.1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libzstd1 riscv64 1.5.7+dfsg-2 [372 kB]
Fetched 372 kB in 0s (13.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpws6u53f9/libzstd1_1.5.7+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmd0 riscv64 1.1.0-2+b1 [37.6 kB]
Fetched 37.6 kB in 0s (1807 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpr_mql0qi/libmd0_1.1.0-2+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libpython3-stdlib riscv64 3.13.7-1 [10.2 kB]
Fetched 10.2 kB in 0s (186 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9hm8cj7n/libpython3-stdlib_3.13.7-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libssl-dev riscv64 3.5.4-1 [6304 kB]
Fetched 6304 kB in 0s (24.8 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwx0qpdxi/libssl-dev_3.5.4-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfabric1 riscv64 2.1.0-1.1 [601 kB]
Fetched 601 kB in 0s (9088 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqezflej1/libfabric1_2.1.0-1.1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libsuperlu-dev riscv64 7.0.1+dfsg1-2 [22.0 kB]
Fetched 22.0 kB in 0s (412 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpol8elyh7/libsuperlu-dev_7.0.1+dfsg1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libasan8 riscv64 15.2.0-8 [2939 kB]
Fetched 2939 kB in 0s (22.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbk6wxykd/libasan8_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libxext6 riscv64 2:1.3.4-1+b3 [51.7 kB]
Fetched 51.7 kB in 0s (2386 kB/s)
dpkg-name: info: moved 'libxext6_2%3a1.3.4-1+b3_riscv64.deb' to '/srv/rebuilderd/tmp/tmpiyajd0ta/libxext6_1.3.4-1+b3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc riscv64 4:15.2.0-4 [5156 B]
Fetched 5156 B in 0s (252 kB/s)
dpkg-name: info: moved 'gcc_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmp0_11h6xg/gcc_15.2.0-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-dev riscv64 3.3.10-2+b1 [1492 kB]
Fetched 1492 kB in 0s (16.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpovh30pd4/libfftw3-dev_3.3.10-2+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 dh-strip-nondeterminism all 1.15.0-1 [8812 B]
Fetched 8812 B in 0s (427 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp41kt6pdk/dh-strip-nondeterminism_1.15.0-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libxnvctrl0 riscv64 535.171.04-1+b2 [14.5 kB]
Fetched 14.5 kB in 0s (273 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2v5ys8ap/libxnvctrl0_535.171.04-1+b2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-dev riscv64 1.16.0-1 [452 kB]
Fetched 452 kB in 0s (7214 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpv5rj5ixq/libngtcp2-dev_1.16.0-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 rpcsvc-proto riscv64 1.4.3-1+b2 [62.3 kB]
Fetched 62.3 kB in 0s (2912 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeeojhpt1/rpcsvc-proto_1.4.3-1+b2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libpmix2t64 riscv64 6.0.0+really5.0.9-2 [660 kB]
Fetched 660 kB in 0s (9735 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg6_k2a9t/libpmix2t64_6.0.0+really5.0.9-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfile-libmagic-perl riscv64 1.23-2+b2 [30.9 kB]
Fetched 30.9 kB in 0s (579 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp97fc23jw/libfile-libmagic-perl_1.23-2+b2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 zlib1g riscv64 1:1.3.dfsg+really1.3.1-1+b1 [85.7 kB]
Fetched 85.7 kB in 0s (3953 kB/s)
dpkg-name: info: moved 'zlib1g_1%3a1.3.dfsg+really1.3.1-1+b1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpx8a4p8jk/zlib1g_1.3.dfsg+really1.3.1-1+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 g++-15-riscv64-linux-gnu riscv64 15.2.0-8 [15.8 MB]
Fetched 15.8 MB in 0s (34.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8aliy89e/g++-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps64-dev riscv64 5.8.1-2 [7344 kB]
Fetched 7344 kB in 0s (25.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnc6to3z3/libmumps64-dev_5.8.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-0 riscv64 0.3.30+ds-3 [43.5 kB]
Fetched 43.5 kB in 0s (807 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpifrldu0k/libopenblas64-0_0.3.30+ds-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 debianutils riscv64 5.23.2 [91.7 kB]
Fetched 91.7 kB in 0s (4203 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt3d3tsyd/debianutils_5.23.2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 build-essential riscv64 12.12 [4628 B]
Fetched 4628 B in 0s (225 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppkpe7ws3/build-essential_12.12_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libamd3 riscv64 1:7.11.0+dfsg-2 [49.8 kB]
Fetched 49.8 kB in 0s (925 kB/s)
dpkg-name: info: moved 'libamd3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp3iz9y6ra/libamd3_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-headers-dev all 5.8.1-2 [36.4 kB]
Fetched 36.4 kB in 0s (679 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo222zt81/libmumps-headers-dev_5.8.1-2_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libperl5.40 riscv64 5.40.1-7 [3952 kB]
Fetched 3952 kB in 0s (38.9 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo5s4qej_/libperl5.40_5.40.1-7_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 python3-minimal riscv64 3.13.7-1 [27.2 kB]
Fetched 27.2 kB in 0s (513 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf99yp_qm/python3-minimal_3.13.7-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libidn2-0 riscv64 2.3.8-4 [110 kB]
Fetched 110 kB in 0s (4808 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp51l01uce/libidn2-0_2.3.8-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-fortran-310 riscv64 1.14.5+repack-4 [116 kB]
Fetched 116 kB in 0s (2099 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqo73gma2/libhdf5-openmpi-fortran-310_1.14.5+repack-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 opencl-clhpp-headers all 3.0~2025.07.22-1 [51.0 kB]
Fetched 51.0 kB in 0s (947 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj8z90s1p/opencl-clhpp-headers_3.0~2025.07.22-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-0-pthread riscv64 0.3.30+ds-3 [3286 kB]
Fetched 3286 kB in 0s (25.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpycwt2yla/libopenblas64-0-pthread_0.3.30+ds-3_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 python3 riscv64 3.13.7-1 [28.3 kB]
Fetched 28.3 kB in 0s (531 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcp4nwba6/python3_3.13.7-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libbz2-1.0 riscv64 1.0.8-6 [40.3 kB]
Fetched 40.3 kB in 0s (1928 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnxygw8n1/libbz2-1.0_1.0.8-6_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 ncurses-base all 6.5+20250216-2 [273 kB]
Fetched 273 kB in 0s (4645 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmdat9llh/ncurses-base_6.5+20250216-2_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre64-2.33.0 riscv64 2.33.0-3 [1468 kB]
Fetched 1468 kB in 0s (17.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpng1fmc7x/libhypre64-2.33.0_2.33.0-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmetis5 riscv64 5.1.0.dfsg-8 [164 kB]
Fetched 164 kB in 0s (2893 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcdwxadfo/libmetis5_5.1.0.dfsg-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libatomic1 riscv64 15.2.0-8 [8516 B]
Fetched 8516 B in 0s (162 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpepqgy8vb/libatomic1_15.2.0-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libparu1 riscv64 1:7.11.0+dfsg-2 [72.9 kB]
Fetched 72.9 kB in 0s (1343 kB/s)
dpkg-name: info: moved 'libparu1_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpba9qbqgw/libparu1_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran riscv64 4:15.2.0-4 [1428 B]
Fetched 1428 B in 0s (27.2 kB/s)
dpkg-name: info: moved 'gfortran_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpvtyolyi0/gfortran_15.2.0-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp2-14 riscv64 1.64.0-1.1+b1 [78.1 kB]
Fetched 78.1 kB in 0s (3588 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpixx2tom9/libnghttp2-14_1.64.0-1.1+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack64-3 riscv64 3.12.1-7 [1931 kB]
Fetched 1931 kB in 0s (20.7 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6jiwxryr/liblapack64-3_3.12.1-7_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 intltool-debian all 0.35.0+20060710.6 [22.9 kB]
Fetched 22.9 kB in 0s (1123 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqixnzz6h/intltool-debian_0.35.0+20060710.6_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libssh2-1-dev riscv64 1.11.1-1 [560 kB]
Fetched 560 kB in 0s (8636 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppr300xjj/libssh2-1-dev_1.11.1-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgnutls30t64 riscv64 3.8.10-3 [1465 kB]
Fetched 1465 kB in 0s (17.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2cz3rpb0/libgnutls30t64_3.8.10-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-6 riscv64 2:1.8.12-1 [817 kB]
Fetched 817 kB in 0s (21.5 MB/s)
dpkg-name: info: moved 'libx11-6_2%3a1.8.12-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpwezbg7pb/libx11-6_1.8.12-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 dpkg-dev all 1.22.21 [1338 kB]
Fetched 1338 kB in 0s (27.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8bffhai1/dpkg-dev_1.22.21_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 binutils riscv64 2.45-8 [269 kB]
Fetched 269 kB in 0s (4598 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp25esamwr/binutils_2.45-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libx11-data all 2:1.8.12-1 [343 kB]
Fetched 343 kB in 0s (12.4 MB/s)
dpkg-name: info: moved 'libx11-data_2%3a1.8.12-1_all.deb' to '/srv/rebuilderd/tmp/tmpxo5xpefx/libx11-data_1.8.12-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libkrb5-dev riscv64 1.22.1-2 [16.2 kB]
Fetched 16.2 kB in 0s (306 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpu4sqeiqq/libkrb5-dev_1.22.1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 dh-fortran-mod all 0.57 [4716 B]
Fetched 4716 B in 0s (89.5 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwrsasedq/dh-fortran-mod_0.57_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsuitesparseconfig7 riscv64 1:7.11.0+dfsg-2 [32.9 kB]
Fetched 32.9 kB in 0s (612 kB/s)
dpkg-name: info: moved 'libsuitesparseconfig7_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp3r7dft68/libsuitesparseconfig7_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libumfpack6 riscv64 1:7.11.0+dfsg-2 [260 kB]
Fetched 260 kB in 0s (4411 kB/s)
dpkg-name: info: moved 'libumfpack6_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpvxa0oth1/libumfpack6_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 mawk riscv64 1.3.4.20250131-1 [142 kB]
Fetched 142 kB in 0s (6121 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj278f51_/mawk_1.3.4.20250131-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmp10 riscv64 2:6.3.0+dfsg-5 [558 kB]
Fetched 558 kB in 0s (17.4 MB/s)
dpkg-name: info: moved 'libgmp10_2%3a6.3.0+dfsg-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmp8v3elnh0/libgmp10_6.3.0+dfsg-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-runtime all 1.7.0-5 [249 kB]
Fetched 249 kB in 0s (9699 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmllosumu/libpam-runtime_1.7.0-5_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc-plugins riscv64 2.12.2-1 [17.8 kB]
Fetched 17.8 kB in 0s (334 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpirivg3so/libhwloc-plugins_2.12.2-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-64i-dev riscv64 7.0.10-2 [19.3 kB]
Fetched 19.3 kB in 0s (363 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4_zdlkiq/libscotch-64i-dev_7.0.10-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-mathjax all 2.7.9+dfsg-1 [5667 kB]
Fetched 5667 kB in 0s (33.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvz9e5zrb/libjs-mathjax_2.7.9+dfsg-1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcxsparse4 riscv64 1:7.11.0+dfsg-2 [89.9 kB]
Fetched 89.9 kB in 0s (1645 kB/s)
dpkg-name: info: moved 'libcxsparse4_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpbtu91yjz/libcxsparse4_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsmartcols1 riscv64 2.41.2-4 [155 kB]
Fetched 155 kB in 0s (6617 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvxscysin/libsmartcols1_2.41.2-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 fonts-mathjax all 2.7.9+dfsg-1 [2210 kB]
Fetched 2210 kB in 0s (22.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeyvae1ae/fonts-mathjax_2.7.9+dfsg-1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc-complex3.24 riscv64 3.24.1+dfsg1-2 [6804 kB]
Fetched 6804 kB in 5s (1472 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbhmyrx8n/libpetsc-complex3.24_3.24.1+dfsg1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libbtf2 riscv64 1:7.11.0+dfsg-2 [33.5 kB]
Fetched 33.5 kB in 0s (622 kB/s)
dpkg-name: info: moved 'libbtf2_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpk0ogfodb/libbtf2_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libxdmcp6 riscv64 1:1.1.5-1 [28.0 kB]
Fetched 28.0 kB in 0s (1298 kB/s)
dpkg-name: info: moved 'libxdmcp6_1%3a1.1.5-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp47dzflnh/libxdmcp6_1.1.5-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-2.1-7t64 riscv64 2.1.12-stable-10+b1 [182 kB]
Fetched 182 kB in 0s (3205 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd1ug0o4s/libevent-2.1-7t64_2.1.12-stable-10+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 liblzma5 riscv64 5.8.1-2 [312 kB]
Fetched 312 kB in 0s (11.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmspn7vfi/liblzma5_5.8.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-dev riscv64 0.3.30+ds-3 [43.5 kB]
Fetched 43.5 kB in 0s (810 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6fpmmiib/libopenblas64-dev_0.3.30+ds-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 passwd riscv64 1:4.18.0-2 [1284 kB]
Fetched 1284 kB in 0s (27.1 MB/s)
dpkg-name: info: moved 'passwd_1%3a4.18.0-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpyftf8w23/passwd_4.18.0-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 python3-click all 8.2.0+0.really.8.1.8-1 [95.4 kB]
Fetched 95.4 kB in 0s (1729 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzgixmmpw/python3-click_8.2.0+0.really.8.1.8-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 g++ riscv64 4:15.2.0-4 [1332 B]
Fetched 1332 B in 0s (66.8 kB/s)
dpkg-name: info: moved 'g++_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpwppg33fz/g++_15.2.0-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libsemanage-common all 3.9-1 [7888 B]
Fetched 7888 B in 0s (384 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqyczkmg5/libsemanage-common_3.9-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas3 riscv64 3.12.1-7 [123 kB]
Fetched 123 kB in 0s (2208 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphcoxhg0m/libblas3_3.12.1-7_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libxcb1-dev riscv64 1.17.0-2+b1 [241 kB]
Fetched 241 kB in 0s (9493 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpx5x22a8h/libxcb1-dev_1.17.0-2+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgnutls-dane0t64 riscv64 3.8.10-3 [456 kB]
Fetched 456 kB in 0s (7313 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpiydpdlwd/libgnutls-dane0t64_3.8.10-3_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 adduser all 3.153 [191 kB]
Fetched 191 kB in 0s (7912 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpedm1pxgw/adduser_3.153_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 m4 riscv64 1.4.20-2 [323 kB]
Fetched 323 kB in 0s (11.9 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplaz0s7ks/m4_1.4.20-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-core-2.1-7t64 riscv64 2.1.12-stable-10+b1 [132 kB]
Fetched 132 kB in 0s (2359 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp51cw_gk8/libevent-core-2.1-7t64_2.1.12-stable-10+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libselinux1 riscv64 3.9-2 [88.5 kB]
Fetched 88.5 kB in 0s (1616 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpml153yhf/libselinux1_3.9-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 sensible-utils all 0.0.26 [27.0 kB]
Fetched 27.0 kB in 0s (1320 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5smpjiri/sensible-utils_0.0.26_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 autoconf all 2.72-3.1 [494 kB]
Fetched 494 kB in 0s (16.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptcq3w948/autoconf_2.72-3.1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libfuse3-4 riscv64 3.17.4-1 [97.8 kB]
Fetched 97.8 kB in 0s (1761 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9y3c6m24/libfuse3-4_3.17.4-1_riscv64.deb'
Downloading dependency 1 of 393: libitm1:riscv64=15.2.0-8
Downloading dependency 2 of 393: libgnutls-openssl27t64:riscv64=3.8.10-3
Downloading dependency 3 of 393: libmagic-mgc:riscv64=1:5.46-5
Downloading dependency 4 of 393: libgfortran-15-dev:riscv64=15.2.0-8
Downloading dependency 5 of 393: libxcb1:riscv64=1.17.0-2+b1
Downloading dependency 6 of 393: media-types:riscv64=14.0.0
Downloading dependency 7 of 393: zlib1g-dev:riscv64=1:1.3.dfsg+really1.3.1-1+b1
Downloading dependency 8 of 393: xtrans-dev:riscv64=1.6.0-1
Downloading dependency 9 of 393: libx11-dev:riscv64=2:1.8.12-1
Downloading dependency 10 of 393: libbsd0:riscv64=0.12.2-2
Downloading dependency 11 of 393: libmumps-dev:riscv64=5.8.1-2
Downloading dependency 12 of 393: libnuma-dev:riscv64=2.0.19-1
Downloading dependency 13 of 393: libibverbs-dev:riscv64=56.1-1+b1
Downloading dependency 14 of 393: libevent-pthreads-2.1-7t64:riscv64=2.1.12-stable-10+b1
Downloading dependency 15 of 393: libcrypt-dev:riscv64=1:4.5.1-1
Downloading dependency 16 of 393: libjansson4:riscv64=2.14-2+b3
Downloading dependency 17 of 393: sysvinit-utils:riscv64=3.15-6
Downloading dependency 18 of 393: perl:riscv64=5.40.1-7
Downloading dependency 19 of 393: libxml2-16:riscv64=2.15.1+dfsg-0.4
Downloading dependency 20 of 393: libcap2:riscv64=1:2.75-10+b1
Downloading dependency 21 of 393: libmount1:riscv64=2.41.2-4
Downloading dependency 22 of 393: dwz:riscv64=0.16-2
Downloading dependency 23 of 393: 
libblas-dev:riscv64=3.12.1-7 Downloading dependency 24 of 393: libsasl2-modules-db:riscv64=2.1.28+dfsg1-10 Downloading dependency 25 of 393: libexpat1:riscv64=2.7.3-1 Downloading dependency 26 of 393: libhogweed6t64:riscv64=3.10.2-1 Downloading dependency 27 of 393: libevent-openssl-2.1-7t64:riscv64=2.1.12-stable-10+b1 Downloading dependency 28 of 393: libtool:riscv64=2.5.4-7 Downloading dependency 29 of 393: libstdc++6:riscv64=15.2.0-8 Downloading dependency 30 of 393: gettext:riscv64=0.23.2-1 Downloading dependency 31 of 393: libldl3:riscv64=1:7.11.0+dfsg-2 Downloading dependency 32 of 393: libfftw3-double3:riscv64=3.3.10-2+b1 Downloading dependency 33 of 393: libtinfo6:riscv64=6.5+20251115-2 Downloading dependency 34 of 393: libelf1t64:riscv64=0.194-1 Downloading dependency 35 of 393: libp11-kit-dev:riscv64=0.25.10-1 Downloading dependency 36 of 393: libhypre-2.33.0:riscv64=2.33.0-3 Downloading dependency 37 of 393: libcolamd3:riscv64=1:7.11.0+dfsg-2 Downloading dependency 38 of 393: libacl1:riscv64=2.3.2-2+b1 Downloading dependency 39 of 393: libsuperlu-dist9:riscv64=9.2.0+dfsg1-4 Downloading dependency 40 of 393: libsuperlu7:riscv64=7.0.1+dfsg1-2 Downloading dependency 41 of 393: libpam-modules-bin:riscv64=1.7.0-5 Downloading dependency 42 of 393: libcbor0.10:riscv64=0.10.2-2.1 Downloading dependency 43 of 393: libzstd1:riscv64=1.5.7+dfsg-2 Downloading dependency 44 of 393: libmd0:riscv64=1.1.0-2+b1 Downloading dependency 45 of 393: libpython3-stdlib:riscv64=3.13.7-1 Downloading dependency 46 of 393: libssl-dev:riscv64=3.5.4-1 Downloading dependency 47 of 393: libfabric1:riscv64=2.1.0-1.1 Downloading dependency 48 of 393: libsuperlu-dev:riscv64=7.0.1+dfsg1-2 Downloading dependency 49 of 393: libasan8:riscv64=15.2.0-8 Downloading dependency 50 of 393: libxext6:riscv64=2:1.3.4-1+b3 Downloading dependency 51 of 393: gcc:riscv64=4:15.2.0-4 Downloading dependency 52 of 393: libfftw3-dev:riscv64=3.3.10-2+b1 Downloading dependency 53 of 393: 
dh-strip-nondeterminism:riscv64=1.15.0-1 Downloading dependency 54 of 393: libxnvctrl0:riscv64=535.171.04-1+b2 Downloading dependency 55 of 393: libngtcp2-dev:riscv64=1.16.0-1 Downloading dependency 56 of 393: rpcsvc-proto:riscv64=1.4.3-1+b2 Downloading dependency 57 of 393: libpmix2t64:riscv64=6.0.0+really5.0.9-2 Downloading dependency 58 of 393: libfile-libmagic-perl:riscv64=1.23-2+b2 Downloading dependency 59 of 393: zlib1g:riscv64=1:1.3.dfsg+really1.3.1-1+b1 Downloading dependency 60 of 393: g++-15-riscv64-linux-gnu:riscv64=15.2.0-8 Downloading dependency 61 of 393: libmumps64-dev:riscv64=5.8.1-2 Downloading dependency 62 of 393: libopenblas64-0:riscv64=0.3.30+ds-3 Downloading dependency 63 of 393: debianutils:riscv64=5.23.2 Downloading dependency 64 of 393: build-essential:riscv64=12.12 Downloading dependency 65 of 393: libamd3:riscv64=1:7.11.0+dfsg-2 Downloading dependency 66 of 393: libmumps-headers-dev:riscv64=5.8.1-2 Downloading dependency 67 of 393: libperl5.40:riscv64=5.40.1-7 Downloading dependency 68 of 393: python3-minimal:riscv64=3.13.7-1 Downloading dependency 69 of 393: libidn2-0:riscv64=2.3.8-4 Downloading dependency 70 of 393: libhdf5-openmpi-fortran-310:riscv64=1.14.5+repack-4 Downloading dependency 71 of 393: opencl-clhpp-headers:riscv64=3.0~2025.07.22-1 Downloading dependency 72 of 393: libopenblas64-0-pthread:riscv64=0.3.30+ds-3 Downloading dependency 73 of 393: python3:riscv64=3.13.7-1 Downloading dependency 74 of 393: libbz2-1.0:riscv64=1.0.8-6 Downloading dependency 75 of 393: ncurses-base:riscv64=6.5+20250216-2 Downloading dependency 76 of 393: libhypre64-2.33.0:riscv64=2.33.0-3 Downloading dependency 77 of 393: libmetis5:riscv64=5.1.0.dfsg-8 Downloading dependency 78 of 393: libatomic1:riscv64=15.2.0-8 Downloading dependency 79 of 393: libparu1:riscv64=1:7.11.0+dfsg-2 Downloading dependency 80 of 393: gfortran:riscv64=4:15.2.0-4 Downloading dependency 81 of 393: libnghttp2-14:riscv64=1.64.0-1.1+b1 Downloading dependency 82 of 393: 
liblapack64-3:riscv64=3.12.1-7 Downloading dependency 83 of 393: intltool-debian:riscv64=0.35.0+20060710.6 Downloading dependency 84 of 393: libssh2-1-dev:riscv64=1.11.1-1 Downloading dependency 85 of 393: libgnutls30t64:riscv64=3.8.10-3 Downloading dependency 86 of 393: libx11-6:riscv64=2:1.8.12-1 Downloading dependency 87 of 393: dpkg-dev:riscv64=1.22.21 Downloading dependency 88 of 393: binutils:riscv64=2.45-8 Downloading dependency 89 of 393: libx11-data:riscv64=2:1.8.12-1 Downloading dependency 90 of 393: libkrb5-dev:riscv64=1.22.1-2 Downloading dependency 91 of 393: dh-fortran-mod:riscv64=0.57 Downloading dependency 92 of 393: libsuitesparseconfig7:riscv64=1:7.11.0+dfsg-2 Downloading dependency 93 of 393: libumfpack6:riscv64=1:7.11.0+dfsg-2 Downloading dependency 94 of 393: mawk:riscv64=1.3.4.20250131-1 Downloading dependency 95 of 393: libgmp10:riscv64=2:6.3.0+dfsg-5 Downloading dependency 96 of 393: libpam-runtime:riscv64=1.7.0-5 Downloading dependency 97 of 393: libhwloc-plugins:riscv64=2.12.2-1 Downloading dependency 98 of 393: libscotch-64i-dev:riscv64=7.0.10-2 Downloading dependency 99 of 393: libjs-mathjax:riscv64=2.7.9+dfsg-1 Downloading dependency 100 of 393: libcxsparse4:riscv64=1:7.11.0+dfsg-2 Downloading dependency 101 of 393: libsmartcols1:riscv64=2.41.2-4 Downloading dependency 102 of 393: fonts-mathjax:riscv64=2.7.9+dfsg-1 Downloading dependency 103 of 393: libpetsc-complex3.24:riscv64=3.24.1+dfsg1-2 Downloading dependency 104 of 393: libbtf2:riscv64=1:7.11.0+dfsg-2 Downloading dependency 105 of 393: libxdmcp6:riscv64=1:1.1.5-1 Downloading dependency 106 of 393: libevent-2.1-7t64:riscv64=2.1.12-stable-10+b1 Downloading dependency 107 of 393: liblzma5:riscv64=5.8.1-2 Downloading dependency 108 of 393: libopenblas64-dev:riscv64=0.3.30+ds-3 Downloading dependency 109 of 393: passwd:riscv64=1:4.18.0-2 Downloading dependency 110 of 393: python3-click:riscv64=8.2.0+0.really.8.1.8-1 Downloading dependency 111 of 393: g++:riscv64=4:15.2.0-4 Downloading 
dependency 112 of 393: libsemanage-common:riscv64=3.9-1 Downloading dependency 113 of 393: libblas3:riscv64=3.12.1-7 Downloading dependency 114 of 393: libxcb1-dev:riscv64=1.17.0-2+b1 Downloading dependency 115 of 393: libgnutls-dane0t64:riscv64=3.8.10-3 Downloading dependency 116 of 393: adduser:riscv64=3.153 Downloading dependency 117 of 393: m4:riscv64=1.4.20-2 Downloading dependency 118 of 393: libevent-core-2.1-7t64:riscv64=2.1.12-stable-10+b1 Downloading dependency 119 of 393: libselinux1:riscv64=3.9-2 Downloading dependency 120 of 393: sensible-utils:riscv64=0.0.26 Downloading dependency 121 of 393: autoconf:riscv64=2.72-3.1 Downloading dependency 122 of 393: libfuse3-4:riscv64=3.17.4-1 Downloading dependency 123 of 393: libpetsc64-complex3.24:riscv64=3.24.1+dfsg1-2Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc64-complex3.24 riscv64 3.24.1+dfsg1-2 [6765 kB] Fetched 6765 kB in 5s (1443 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp97xp5f0r/libpetsc64-complex3.24_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp3-dev riscv64 1.12.0-1 [203 kB] Fetched 203 kB in 0s (3521 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdfyh5075/libnghttp3-dev_1.12.0-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscalapack-mpi-dev riscv64 2.2.2-2 [6524 B] Fetched 6524 B in 0s (123 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk4igd7yo/libscalapack-mpi-dev_2.2.2-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-mpi-dev riscv64 3.3.10-2+b1 [77.5 kB] Fetched 77.5 kB in 0s (1419 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqj5o672x/libfftw3-mpi-dev_3.3.10-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkdb5-10t64 riscv64 1.22.1-2 [43.9 kB] Fetched 43.9 kB in 0s (819 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmp_1poqfnc/libkdb5-10t64_1.22.1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 dpkg riscv64 1.22.21 [1542 kB] Fetched 1542 kB in 0s (28.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdi9iko0o/dpkg_1.22.21_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libbrotli-dev riscv64 1.1.0-2+b7 [798 kB] Fetched 798 kB in 0s (11.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3o9vqpgi/libbrotli-dev_1.1.0-2+b7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-long3 riscv64 3.3.10-2+b1 [705 kB] Fetched 705 kB in 0s (10.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0p00uti3/libfftw3-long3_3.3.10-2+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam0g riscv64 1.7.0-5 [70.2 kB] Fetched 70.2 kB in 0s (2674 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_n9kasnr/libpam0g_1.7.0-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp riscv64 4:15.2.0-4 [1572 B] Fetched 1572 B in 0s (77.8 kB/s) dpkg-name: info: moved 'cpp_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpfp6gck7c/cpp_15.2.0-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjpeg-dev riscv64 1:2.1.5-4 [72.2 kB] Fetched 72.2 kB in 0s (1314 kB/s) dpkg-name: info: moved 'libjpeg-dev_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpspdf2lh2/libjpeg-dev_2.1.5-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 librbio4 riscv64 1:7.11.0+dfsg-2 [48.2 kB] Fetched 48.2 kB in 0s (896 kB/s) dpkg-name: info: moved 'librbio4_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp7524zzkt/librbio4_7.11.0+dfsg-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 python3-magic all 2:0.4.27-3 [14.6 kB] Fetched 14.6 kB in 0s (276 kB/s) dpkg-name: info: moved 'python3-magic_2%3a0.4.27-3_all.deb' to 
'/srv/rebuilderd/tmp/tmpn2if26bv/python3-magic_0.4.27-3_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 gcc-15-base riscv64 15.2.0-8 [53.6 kB] Fetched 53.6 kB in 0s (605 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8vfwebal/gcc-15-base_15.2.0-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libp11-kit0 riscv64 0.25.10-1 [450 kB] Fetched 450 kB in 0s (15.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphaf1sd1c/libp11-kit0_0.25.10-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libdb5.3t64 riscv64 5.3.28+dfsg2-10 [719 kB] Fetched 719 kB in 0s (10.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpat7hfnln/libdb5.3t64_5.3.28+dfsg2-10_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenmpi40 riscv64 5.0.9-1 [2315 kB] Fetched 2315 kB in 0s (22.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpc2383brx/libopenmpi40_5.0.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 diffutils riscv64 1:3.12-1 [405 kB] Fetched 405 kB in 0s (13.9 MB/s) dpkg-name: info: moved 'diffutils_1%3a3.12-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpmkldqoei/diffutils_3.12-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 dash riscv64 0.5.12-12 [101 kB] Fetched 101 kB in 0s (4233 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmperbgj9yy/dash_0.5.12-12_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotcherr-dev riscv64 7.0.10-2 [11.5 kB] Fetched 11.5 kB in 0s (213 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbkfse58j/libscotcherr-dev_7.0.10-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libctf0 riscv64 2.45-8 [96.0 kB] Fetched 96.0 kB in 0s (1750 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp14gxb9t_/libctf0_2.45-8_riscv64.deb' Get:1 
http://deb.debian.org/debian unstable/main riscv64 base-files riscv64 14 [72.9 kB] Fetched 72.9 kB in 0s (3398 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8ru7m_kw/base-files_14_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack-dev riscv64 3.12.1-7 [12.0 MB] Fetched 12.0 MB in 0s (25.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpilzekgnj/liblapack-dev_3.12.1-7_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc3.24-dev-common all 3.24.1+dfsg1-2 [315 kB] Fetched 315 kB in 0s (5278 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpctjdojsx/libpetsc3.24-dev-common_3.24.1+dfsg1-2_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpkgconf3 riscv64 1.8.1-4 [36.0 kB] Fetched 36.0 kB in 0s (1694 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplczkrye7/libpkgconf3_1.8.1-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libnl-3-200 riscv64 3.11.0-2 [61.1 kB] Fetched 61.1 kB in 0s (1124 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfevgmwba/libnl-3-200_3.11.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibverbs1 riscv64 56.1-1+b1 [62.9 kB] Fetched 62.9 kB in 0s (1156 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpw69p4ma6/libibverbs1_56.1-1+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 util-linux riscv64 2.41.2-4 [1158 kB] Fetched 1158 kB in 0s (14.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1ulj3gio/util-linux_2.41.2-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libbinutils riscv64 2.45-8 [521 kB] Fetched 521 kB in 0s (8120 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpv8nfb4ds/libbinutils_2.45-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libucx0 
riscv64 1.19.0+ds-1+b1 [1205 kB] Fetched 1205 kB in 0s (14.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbv1kxbe_/libucx0_1.19.0+ds-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgssrpc4t64 riscv64 1.22.1-2 [60.8 kB] Fetched 60.8 kB in 0s (1101 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3zsxlt_4/libgssrpc4t64_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgdbm6t64 riscv64 1.26-1 [79.0 kB] Fetched 79.0 kB in 0s (3607 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph224zxus/libgdbm6t64_1.26-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 grep riscv64 3.12-1 [442 kB] Fetched 442 kB in 0s (15.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp2wbj_j2w/grep_3.12-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libuchardet0 riscv64 0.0.8-2 [68.3 kB] Fetched 68.3 kB in 0s (3194 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcy679cw3/libuchardet0_0.0.8-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libdebhelper-perl all 13.28 [92.4 kB] Fetched 92.4 kB in 0s (4057 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpstr722_l/libdebhelper-perl_13.28_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsystemd0 riscv64 259~rc1-1 [479 kB] Fetched 479 kB in 0s (7548 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpi2eazcq3/libsystemd0_259~rc1-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsuitesparse-mongoose3 riscv64 1:7.11.0+dfsg-2 [57.2 kB] Fetched 57.2 kB in 0s (1057 kB/s) dpkg-name: info: moved 'libsuitesparse-mongoose3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpg_rdrx0k/libsuitesparse-mongoose3_7.11.0+dfsg-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcurl4t64 riscv64 
8.17.0-2 [414 kB] Fetched 414 kB in 0s (6671 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxo0kwsdo/libcurl4t64_8.17.0-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 python3.13-minimal riscv64 3.13.9-1 [2169 kB] Fetched 2169 kB in 0s (33.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsct91hqc/python3.13-minimal_3.13.9-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 binutils-riscv64-linux-gnu riscv64 2.45-8 [891 kB] Fetched 891 kB in 0s (12.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_dn58t77/binutils-riscv64-linux-gnu_2.45-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 gcc-15 riscv64 15.2.0-8 [516 kB] Fetched 516 kB in 0s (8063 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy6heu3w1/gcc-15_15.2.0-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcamd3 riscv64 1:7.11.0+dfsg-2 [45.7 kB] Fetched 45.7 kB in 0s (841 kB/s) dpkg-name: info: moved 'libcamd3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpsrvxt9xk/libcamd3_7.11.0+dfsg-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc15 riscv64 2.12.2-1 [158 kB] Fetched 158 kB in 0s (2803 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpi19p965a/libhwloc15_2.12.2-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libffi8 riscv64 3.5.2-2 [21.8 kB] Fetched 21.8 kB in 0s (1058 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphecf87ux/libffi8_3.5.2-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 xorg-sgml-doctools all 1:1.11-1.1 [22.1 kB] Fetched 22.1 kB in 0s (1066 kB/s) dpkg-name: info: moved 'xorg-sgml-doctools_1%3a1.11-1.1_all.deb' to '/srv/rebuilderd/tmp/tmpy5q7041y/xorg-sgml-doctools_1.11-1.1_all.deb' Get:1 http://deb.debian.org/debian 
unstable/main riscv64 libhdf5-openmpi-hl-cpp-310 riscv64 1.14.5+repack-4 [24.4 kB] Fetched 24.4 kB in 0s (457 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfmdk27x2/libhdf5-openmpi-hl-cpp-310_1.14.5+repack-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libpython3.13-stdlib riscv64 3.13.9-1 [1928 kB] Fetched 1928 kB in 0s (19.8 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpbulkm8v3/libpython3.13-stdlib_3.13.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-crypto-ossl0 riscv64 1.16.0-1 [27.7 kB] Fetched 27.7 kB in 0s (1338 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvf9ute3h/libngtcp2-crypto-ossl0_1.16.0-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscalapack-openmpi-dev riscv64 2.2.2-2 [11.4 kB] Fetched 11.4 kB in 0s (215 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnbot6vhn/libscalapack-openmpi-dev_2.2.2-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libltdl7 riscv64 2.5.4-7 [415 kB] Fetched 415 kB in 0s (6722 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphqvw39ll/libltdl7_2.5.4-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libcrypt1 riscv64 1:4.5.1-1 [114 kB] Fetched 114 kB in 0s (5052 kB/s) dpkg-name: info: moved 'libcrypt1_1%3a4.5.1-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmptfoq9hah/libcrypt1_4.5.1-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 python3.13 riscv64 3.13.9-1 [764 kB] Fetched 764 kB in 0s (20.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp48cbfe1g/python3.13_3.13.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 make riscv64 4.4.1-3 [463 kB] Fetched 463 kB in 0s (15.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpejm4o41n/make_4.4.1-3_riscv64.deb' Get:1 
http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libblkid1 riscv64 2.41.2-4 [193 kB] Fetched 193 kB in 0s (7950 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjpfc7tqf/libblkid1_2.41.2-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfile-stripnondeterminism-perl all 1.15.0-1 [19.9 kB] Fetched 19.9 kB in 0s (967 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpapnising/libfile-stripnondeterminism-perl_1.15.0-1_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libpython3.13-minimal riscv64 3.13.9-1 [861 kB] Fetched 861 kB in 0s (22.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppltrapv7/libpython3.13-minimal_3.13.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libssl3t64 riscv64 3.5.4-1 [2206 kB] Fetched 2206 kB in 0s (21.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcw0uqqt4/libssl3t64_3.5.4-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-dev riscv64 1.14.5+repack-4 [7582 kB] Fetched 7582 kB in 0s (24.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmsdnw50z/libhdf5-openmpi-dev_1.14.5+repack-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libxau6 riscv64 1:1.0.11-1 [20.5 kB] Fetched 20.5 kB in 0s (981 kB/s) dpkg-name: info: moved 'libxau6_1%3a1.0.11-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpvup92pdl/libxau6_1.0.11-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libedit2 riscv64 3.1-20250104-1 [92.5 kB] Fetched 92.5 kB in 0s (4216 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpe6w0yyl9/libedit2_3.1-20250104-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libaudit-common all 1:4.1.2-1 [14.3 kB] Fetched 14.3 kB in 0s (691 kB/s) dpkg-name: info: moved 'libaudit-common_1%3a4.1.2-1_all.deb' to 
'/srv/rebuilderd/tmp/tmpu6ekwjkw/libaudit-common_4.1.2-1_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-64i-dev riscv64 7.0.10-2 [16.7 kB] Fetched 16.7 kB in 0s (316 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjlxcqrxk/libptscotch-64i-dev_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libldap-dev riscv64 2.6.10+dfsg-1 [584 kB] Fetched 584 kB in 0s (8920 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0nmfkxcc/libldap-dev_2.6.10+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkadm5srv-mit12 riscv64 1.22.1-2 [55.8 kB] Fetched 55.8 kB in 0s (1032 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_xbz0839/libkadm5srv-mit12_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ibverbs-providers riscv64 56.1-1+b1 [382 kB] Fetched 382 kB in 0s (6252 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa5o1om7c/ibverbs-providers_56.1-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkrb5-3 riscv64 1.22.1-2 [346 kB] Fetched 346 kB in 0s (12.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp42ys7gox/libkrb5-3_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmpc3 riscv64 1.3.1-2 [56.4 kB] Fetched 56.4 kB in 0s (2575 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoz9q0jb9/libmpc3_1.3.1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libc6-dev riscv64 2.41-12 [3127 kB] Fetched 3127 kB in 0s (25.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa5w4lowz/libc6-dev_2.41-12_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libnl-3-dev riscv64 3.11.0-2 [172 kB] Fetched 172 kB in 0s (3039 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk_wlem_s/libnl-3-dev_3.11.0-2_riscv64.deb' Get:1 
http://deb.debian.org/debian unstable/main riscv64 libngtcp2-16 riscv64 1.16.0-1 [141 kB] Fetched 141 kB in 0s (6078 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpaf23wbn5/libngtcp2-16_1.16.0-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-7.0c riscv64 7.0.10-2 [167 kB] Fetched 167 kB in 0s (2954 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoyyvwc2g/libptscotch-7.0c_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 openmpi-common all 5.0.9-1 [97.6 kB] Fetched 97.6 kB in 0s (1771 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdclhb9l5/openmpi-common_5.0.9-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-extra-2.1-7t64 riscv64 2.1.12-stable-10+b1 [108 kB] Fetched 108 kB in 0s (1599 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnk4vkotn/libevent-extra-2.1-7t64_2.1.12-stable-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibmad5 riscv64 56.1-1+b1 [44.9 kB] Fetched 44.9 kB in 0s (836 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0v9ly4_p/libibmad5_56.1-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libpam-modules riscv64 1.7.0-5 [177 kB] Fetched 177 kB in 0s (7405 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpp62wdxeg/libpam-modules_1.7.0-5_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libnl-route-3-dev riscv64 3.11.0-2 [534 kB] Fetched 534 kB in 0s (8299 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkjd674wj/libnl-route-3-dev_3.11.0-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc64-real3.24 riscv64 3.24.1+dfsg1-2 [6687 kB] Fetched 6687 kB in 7s (1022 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpf_x0yrce/libpetsc64-real3.24_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://deb.debian.org/debian 
unstable/main riscv64 libjpeg62-turbo riscv64 1:2.1.5-4 [155 kB] Fetched 155 kB in 0s (6564 kB/s) dpkg-name: info: moved 'libjpeg62-turbo_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmph2zyshq4/libjpeg62-turbo_2.1.5-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librtmp1 riscv64 2.4+20151223.gitfa8646d.1-3 [58.7 kB] Fetched 58.7 kB in 0s (2761 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp3y_c95ce/librtmp1_2.4+20151223.gitfa8646d.1-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfortran-toml-0 riscv64 0.4.3-1 [85.4 kB] Fetched 85.4 kB in 0s (1218 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkucocoqx/libfortran-toml-0_0.4.3-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-64i-7.0 riscv64 7.0.10-2 [253 kB] Fetched 253 kB in 0s (4325 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpi3ud9s0a/libscotch-64i-7.0_7.0.10-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 xz-utils riscv64 5.8.1-2 [659 kB] Fetched 659 kB in 0s (19.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp20czqhzu/xz-utils_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhwloc-dev riscv64 2.12.2-1 [520 kB] Fetched 520 kB in 0s (8101 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpfct0n13u/libhwloc-dev_2.12.2-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-64-7.0 riscv64 7.0.10-2 [153 kB] Fetched 153 kB in 2s (72.2 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpinx3amat/libptscotch-64-7.0_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libparpack2t64 riscv64 3.9.1-6 [86.5 kB] Fetched 86.5 kB in 0s (1578 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkqjuddun/libparpack2t64_3.9.1-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main 
riscv64 libkrb5support0 riscv64 1.22.1-2 [33.7 kB] Fetched 33.7 kB in 0s (1620 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg6s_p8on/libkrb5support0_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtasn1-6 riscv64 4.20.0-2 [50.6 kB] Fetched 50.6 kB in 0s (2382 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpojb8whx4/libtasn1-6_4.20.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 krb5-multidev riscv64 1.22.1-2 [127 kB] Fetched 127 kB in 0s (2271 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpawz0uqzb/krb5-multidev_1.22.1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 liblsan0 riscv64 15.2.0-8 [1326 kB] Fetched 1326 kB in 0s (15.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4lhy87nq/liblsan0_15.2.0-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcap-ng0 riscv64 0.8.5-4+b1 [17.2 kB] Fetched 17.2 kB in 0s (840 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmps7a5decr/libcap-ng0_0.8.5-4+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 sed riscv64 4.9-2 [329 kB] Fetched 329 kB in 0s (12.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmphacezz1q/sed_4.9-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libc-bin riscv64 2.41-12 [606 kB] Fetched 606 kB in 0s (18.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph41rha1x/libc-bin_2.41-12_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcurl4-openssl-dev riscv64 8.17.0-2 [1316 kB] Fetched 1316 kB in 0s (14.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgf6gawq4/libcurl4-openssl-dev_8.17.0-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 readline-common all 8.3-3 [74.8 kB] Fetched 74.8 kB in 0s (1368 kB/s) dpkg-name: warning: 
skipping '/srv/rebuilderd/tmp/tmpk65ae3u4/readline-common_8.3-3_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 base-passwd riscv64 3.6.8 [54.8 kB] Fetched 54.8 kB in 0s (2595 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5c719fnr/base-passwd_3.6.8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcom-err2 riscv64 1.47.2-3+b3 [24.7 kB] Fetched 24.7 kB in 0s (464 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmwy8og_e/libcom-err2_1.47.2-3+b3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmunge2 riscv64 0.5.16-1 [19.9 kB] Fetched 19.9 kB in 0s (376 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0jznatvs/libmunge2_0.5.16-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 ncurses-bin riscv64 6.5+20250216-2 [436 kB] Fetched 436 kB in 0s (6987 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpv68xsqf8/ncurses-bin_6.5+20250216-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 pkgconf-bin riscv64 1.8.1-4 [29.8 kB] Fetched 29.8 kB in 0s (1435 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzjrxci_0/pkgconf-bin_1.8.1-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libunistring5 riscv64 1.3-2 [474 kB] Fetched 474 kB in 0s (15.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk9z9j1sv/libunistring5_1.3-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 librdmacm1t64 riscv64 56.1-1+b1 [72.7 kB] Fetched 72.7 kB in 0s (1337 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpojcvu5mi/librdmacm1t64_56.1-1+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libuuid1 riscv64 2.41.2-4 [40.0 kB] Fetched 40.0 kB in 0s (1923 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwm8enmjp/libuuid1_2.41.2-4_riscv64.deb' Get:1 http://deb.debian.org/debian 
unstable/main riscv64 perl-base riscv64 5.40.1-7 [1678 kB] Fetched 1678 kB in 0s (30.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl9jjudp3/perl-base_5.40.1-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkadm5clnt-mit12 riscv64 1.22.1-2 [42.2 kB] Fetched 42.2 kB in 0s (785 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmo7mtnpd/libkadm5clnt-mit12_1.22.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libibumad3 riscv64 56.1-1+b1 [29.9 kB] Fetched 29.9 kB in 0s (559 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxsazr5y4/libibumad3_56.1-1+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-5.8 riscv64 5.8.1-2 [1838 kB] Fetched 1838 kB in 0s (19.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcesjw40f/libmumps-5.8_5.8.1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libmpfr6 riscv64 4.2.2-2 [664 kB] Fetched 664 kB in 0s (19.3 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpg66g73gc/libmpfr6_4.2.2-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libltdl-dev riscv64 2.5.4-7 [193 kB] Fetched 193 kB in 0s (3339 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmprxc93ywz/libltdl-dev_2.5.4-7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 man-db riscv64 2.13.1-1 [1458 kB] Fetched 1458 kB in 0s (28.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl0xv7c4c/man-db_2.13.1-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotcherr-dev riscv64 7.0.10-2 [11.5 kB] Fetched 11.5 kB in 0s (218 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqb6j1qgr/libptscotcherr-dev_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 patchelf riscv64 0.18.0-1.4 [103 kB] Fetched 103 kB in 0s (1856 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpjm1_w8od/patchelf_0.18.0-1.4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc3.24-dev-examples all 3.24.1+dfsg1-2 [3568 kB] Fetched 3568 kB in 4s (849 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0vsh5auu/libpetsc3.24-dev-examples_3.24.1+dfsg1-2_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libkeyutils1 riscv64 1.6.3-6 [9480 B] Fetched 9480 B in 0s (471 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo2kmpe6o/libkeyutils1_1.6.3-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsz2 riscv64 1.1.4-2 [7900 B] Fetched 7900 B in 0s (149 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmputmquivw/libsz2_1.1.4-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libtasn1-6-dev riscv64 4.20.0-2 [149 kB] Fetched 149 kB in 0s (2651 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq9e61ohq/libtasn1-6-dev_4.20.0-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-64-dev riscv64 7.0.10-2 [16.8 kB] Fetched 16.8 kB in 2s (7712 B/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgjzr5lvk/libptscotch-64-dev_7.0.10-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc-real3.24 riscv64 3.24.1+dfsg1-2 [6750 kB] Fetched 6750 kB in 0s (24.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyrbdoexw/libpetsc-real3.24_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libspqr4 riscv64 1:7.11.0+dfsg-2 [139 kB] Fetched 139 kB in 0s (2465 kB/s) dpkg-name: info: moved 'libspqr4_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpodiqy3_d/libspqr4_7.11.0+dfsg-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-jquery-ui all 1.13.2+dfsg-1 [250 kB] Fetched 250 kB in 0s (4286 kB/s) dpkg-name: warning: 
skipping '/srv/rebuilderd/tmp/tmpzkp4r75j/libjs-jquery-ui_1.13.2+dfsg-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 mpi-default-bin riscv64 1.19 [2644 B] Fetched 2644 B in 0s (50.3 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp62b3qm9m/mpi-default-bin_1.19_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 cpp-15 riscv64 15.2.0-8 [1272 B] Fetched 1272 B in 0s (24.3 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvv27tup6/cpp-15_15.2.0-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libaec0 riscv64 1.1.4-2 [24.1 kB] Fetched 24.1 kB in 0s (451 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpj2l08afy/libaec0_1.1.4-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gcc-riscv64-linux-gnu riscv64 4:15.2.0-4 [1432 B] Fetched 1432 B in 0s (71.5 kB/s) dpkg-name: info: moved 'gcc-riscv64-linux-gnu_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmp3oav51m0/gcc-riscv64-linux-gnu_15.2.0-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsqlite3-0 riscv64 3.46.1-8 [910 kB] Fetched 910 kB in 0s (23.0 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1fnvh104/libsqlite3-0_3.46.1-8_riscv64.deb' Downloading dependency 124 of 393: libnghttp3-dev:riscv64=1.12.0-1 Downloading dependency 125 of 393: libscalapack-mpi-dev:riscv64=2.2.2-2 Downloading dependency 126 of 393: libfftw3-mpi-dev:riscv64=3.3.10-2+b1 Downloading dependency 127 of 393: libkdb5-10t64:riscv64=1.22.1-2 Downloading dependency 128 of 393: dpkg:riscv64=1.22.21 Downloading dependency 129 of 393: libbrotli-dev:riscv64=1.1.0-2+b7 Downloading dependency 130 of 393: libfftw3-long3:riscv64=3.3.10-2+b1 Downloading dependency 131 of 393: libpam0g:riscv64=1.7.0-5 Downloading dependency 132 of 393: cpp:riscv64=4:15.2.0-4 Downloading dependency 133 of 393: libjpeg-dev:riscv64=1:2.1.5-4 Downloading dependency 134 of 393: librbio4:riscv64=1:7.11.0+dfsg-2 
Downloading dependency 135 of 393: python3-magic:riscv64=2:0.4.27-3 Downloading dependency 136 of 393: gcc-15-base:riscv64=15.2.0-8 Downloading dependency 137 of 393: libp11-kit0:riscv64=0.25.10-1 Downloading dependency 138 of 393: libdb5.3t64:riscv64=5.3.28+dfsg2-10 Downloading dependency 139 of 393: libopenmpi40:riscv64=5.0.9-1 Downloading dependency 140 of 393: diffutils:riscv64=1:3.12-1 Downloading dependency 141 of 393: dash:riscv64=0.5.12-12 Downloading dependency 142 of 393: libscotcherr-dev:riscv64=7.0.10-2 Downloading dependency 143 of 393: libctf0:riscv64=2.45-8 Downloading dependency 144 of 393: base-files:riscv64=14 Downloading dependency 145 of 393: liblapack-dev:riscv64=3.12.1-7 Downloading dependency 146 of 393: libpetsc3.24-dev-common:riscv64=3.24.1+dfsg1-2 Downloading dependency 147 of 393: libpkgconf3:riscv64=1.8.1-4 Downloading dependency 148 of 393: libnl-3-200:riscv64=3.11.0-2 Downloading dependency 149 of 393: libibverbs1:riscv64=56.1-1+b1 Downloading dependency 150 of 393: util-linux:riscv64=2.41.2-4 Downloading dependency 151 of 393: libbinutils:riscv64=2.45-8 Downloading dependency 152 of 393: libucx0:riscv64=1.19.0+ds-1+b1 Downloading dependency 153 of 393: libgssrpc4t64:riscv64=1.22.1-2 Downloading dependency 154 of 393: libgdbm6t64:riscv64=1.26-1 Downloading dependency 155 of 393: grep:riscv64=3.12-1 Downloading dependency 156 of 393: libuchardet0:riscv64=0.0.8-2 Downloading dependency 157 of 393: libdebhelper-perl:riscv64=13.28 Downloading dependency 158 of 393: libsystemd0:riscv64=259~rc1-1 Downloading dependency 159 of 393: libsuitesparse-mongoose3:riscv64=1:7.11.0+dfsg-2 Downloading dependency 160 of 393: libcurl4t64:riscv64=8.17.0-2 Downloading dependency 161 of 393: python3.13-minimal:riscv64=3.13.9-1 Downloading dependency 162 of 393: binutils-riscv64-linux-gnu:riscv64=2.45-8 Downloading dependency 163 of 393: gcc-15:riscv64=15.2.0-8 Downloading dependency 164 of 393: libcamd3:riscv64=1:7.11.0+dfsg-2 Downloading dependency 165 of 
393: libhwloc15:riscv64=2.12.2-1 Downloading dependency 166 of 393: libffi8:riscv64=3.5.2-2 Downloading dependency 167 of 393: xorg-sgml-doctools:riscv64=1:1.11-1.1 Downloading dependency 168 of 393: libhdf5-openmpi-hl-cpp-310:riscv64=1.14.5+repack-4 Downloading dependency 169 of 393: libpython3.13-stdlib:riscv64=3.13.9-1 Downloading dependency 170 of 393: libngtcp2-crypto-ossl0:riscv64=1.16.0-1 Downloading dependency 171 of 393: libscalapack-openmpi-dev:riscv64=2.2.2-2 Downloading dependency 172 of 393: libltdl7:riscv64=2.5.4-7 Downloading dependency 173 of 393: libcrypt1:riscv64=1:4.5.1-1 Downloading dependency 174 of 393: python3.13:riscv64=3.13.9-1 Downloading dependency 175 of 393: make:riscv64=4.4.1-3 Downloading dependency 176 of 393: libblkid1:riscv64=2.41.2-4 Downloading dependency 177 of 393: libfile-stripnondeterminism-perl:riscv64=1.15.0-1 Downloading dependency 178 of 393: libpython3.13-minimal:riscv64=3.13.9-1 Downloading dependency 179 of 393: libssl3t64:riscv64=3.5.4-1 Downloading dependency 180 of 393: libhdf5-openmpi-dev:riscv64=1.14.5+repack-4 Downloading dependency 181 of 393: libxau6:riscv64=1:1.0.11-1 Downloading dependency 182 of 393: libedit2:riscv64=3.1-20250104-1 Downloading dependency 183 of 393: libaudit-common:riscv64=1:4.1.2-1 Downloading dependency 184 of 393: libptscotch-64i-dev:riscv64=7.0.10-2 Downloading dependency 185 of 393: libldap-dev:riscv64=2.6.10+dfsg-1 Downloading dependency 186 of 393: libkadm5srv-mit12:riscv64=1.22.1-2 Downloading dependency 187 of 393: ibverbs-providers:riscv64=56.1-1+b1 Downloading dependency 188 of 393: libkrb5-3:riscv64=1.22.1-2 Downloading dependency 189 of 393: libmpc3:riscv64=1.3.1-2 Downloading dependency 190 of 393: libc6-dev:riscv64=2.41-12 Downloading dependency 191 of 393: libnl-3-dev:riscv64=3.11.0-2 Downloading dependency 192 of 393: libngtcp2-16:riscv64=1.16.0-1 Downloading dependency 193 of 393: libptscotch-7.0c:riscv64=7.0.10-2 Downloading dependency 194 of 393: 
openmpi-common:riscv64=5.0.9-1 Downloading dependency 195 of 393: libevent-extra-2.1-7t64:riscv64=2.1.12-stable-10+b1 Downloading dependency 196 of 393: libibmad5:riscv64=56.1-1+b1 Downloading dependency 197 of 393: libpam-modules:riscv64=1.7.0-5 Downloading dependency 198 of 393: libnl-route-3-dev:riscv64=3.11.0-2 Downloading dependency 199 of 393: libpetsc64-real3.24:riscv64=3.24.1+dfsg1-2 Downloading dependency 200 of 393: libjpeg62-turbo:riscv64=1:2.1.5-4 Downloading dependency 201 of 393: librtmp1:riscv64=2.4+20151223.gitfa8646d.1-3 Downloading dependency 202 of 393: libfortran-toml-0:riscv64=0.4.3-1 Downloading dependency 203 of 393: libscotch-64i-7.0:riscv64=7.0.10-2 Downloading dependency 204 of 393: xz-utils:riscv64=5.8.1-2 Downloading dependency 205 of 393: libhwloc-dev:riscv64=2.12.2-1 Downloading dependency 206 of 393: libptscotch-64-7.0:riscv64=7.0.10-2 Downloading dependency 207 of 393: libparpack2t64:riscv64=3.9.1-6 Downloading dependency 208 of 393: libkrb5support0:riscv64=1.22.1-2 Downloading dependency 209 of 393: libtasn1-6:riscv64=4.20.0-2 Downloading dependency 210 of 393: krb5-multidev:riscv64=1.22.1-2 Downloading dependency 211 of 393: liblsan0:riscv64=15.2.0-8 Downloading dependency 212 of 393: libcap-ng0:riscv64=0.8.5-4+b1 Downloading dependency 213 of 393: sed:riscv64=4.9-2 Downloading dependency 214 of 393: libc-bin:riscv64=2.41-12 Downloading dependency 215 of 393: libcurl4-openssl-dev:riscv64=8.17.0-2 Downloading dependency 216 of 393: readline-common:riscv64=8.3-3 Downloading dependency 217 of 393: base-passwd:riscv64=3.6.8 Downloading dependency 218 of 393: libcom-err2:riscv64=1.47.2-3+b3 Downloading dependency 219 of 393: libmunge2:riscv64=0.5.16-1 Downloading dependency 220 of 393: ncurses-bin:riscv64=6.5+20250216-2 Downloading dependency 221 of 393: pkgconf-bin:riscv64=1.8.1-4 Downloading dependency 222 of 393: libunistring5:riscv64=1.3-2 Downloading dependency 223 of 393: librdmacm1t64:riscv64=56.1-1+b1 Downloading dependency 224 
of 393: libuuid1:riscv64=2.41.2-4 Downloading dependency 225 of 393: perl-base:riscv64=5.40.1-7 Downloading dependency 226 of 393: libkadm5clnt-mit12:riscv64=1.22.1-2 Downloading dependency 227 of 393: libibumad3:riscv64=56.1-1+b1 Downloading dependency 228 of 393: libmumps-5.8:riscv64=5.8.1-2 Downloading dependency 229 of 393: libmpfr6:riscv64=4.2.2-2 Downloading dependency 230 of 393: libltdl-dev:riscv64=2.5.4-7 Downloading dependency 231 of 393: man-db:riscv64=2.13.1-1 Downloading dependency 232 of 393: libptscotcherr-dev:riscv64=7.0.10-2 Downloading dependency 233 of 393: patchelf:riscv64=0.18.0-1.4 Downloading dependency 234 of 393: libpetsc3.24-dev-examples:riscv64=3.24.1+dfsg1-2 Downloading dependency 235 of 393: libkeyutils1:riscv64=1.6.3-6 Downloading dependency 236 of 393: libsz2:riscv64=1.1.4-2 Downloading dependency 237 of 393: libtasn1-6-dev:riscv64=4.20.0-2 Downloading dependency 238 of 393: libptscotch-64-dev:riscv64=7.0.10-2 Downloading dependency 239 of 393: libpetsc-real3.24:riscv64=3.24.1+dfsg1-2 Downloading dependency 240 of 393: libspqr4:riscv64=1:7.11.0+dfsg-2 Downloading dependency 241 of 393: libjs-jquery-ui:riscv64=1.13.2+dfsg-1 Downloading dependency 242 of 393: mpi-default-bin:riscv64=1.19 Downloading dependency 243 of 393: cpp-15:riscv64=15.2.0-8 Downloading dependency 244 of 393: libaec0:riscv64=1.1.4-2 Downloading dependency 245 of 393: gcc-riscv64-linux-gnu:riscv64=4:15.2.0-4 Downloading dependency 246 of 393: libsqlite3-0:riscv64=3.46.1-8 Downloading dependency 247 of 393: libxau-dev:riscv64=1:1.0.11-1Get:1 http://deb.debian.org/debian unstable/main riscv64 libxau-dev riscv64 1:1.0.11-1 [27.6 kB] Fetched 27.6 kB in 0s (1354 kB/s) dpkg-name: info: moved 'libxau-dev_1%3a1.0.11-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpxl_1kqp3/libxau-dev_1.0.11-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp2-dev riscv64 1.64.0-1.1+b1 [220 kB] Fetched 220 kB in 0s (8141 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmp0bqmv16j/libnghttp2-dev_1.64.0-1.1+b1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc-complex3.24-dev riscv64 3.24.1+dfsg1-2 [13.5 MB] Fetched 13.5 MB in 5s (2624 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp22j6fl_9/libpetsc-complex3.24-dev_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libyaml-dev riscv64 0.2.5-2 [138 kB] Fetched 138 kB in 0s (2433 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt97f_em9/libyaml-dev_0.2.5-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 dh-fortran riscv64 0.57 [35.6 kB] Fetched 35.6 kB in 0s (661 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpi2_pioj0/dh-fortran_0.57_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgprofng0 riscv64 2.45-8 [716 kB] Fetched 716 kB in 0s (10.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpuojc0nse/libgprofng0_2.45-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscalapack-openmpi2.2 riscv64 2.2.2-2 [1363 kB] Fetched 1363 kB in 0s (16.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpiszx_tpn/libscalapack-openmpi2.2_2.2.2-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotcherr-7.0 riscv64 7.0.10-2 [12.1 kB] Fetched 12.1 kB in 0s (230 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxaphwf0b/libscotcherr-7.0_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libarpack2t64 riscv64 3.9.1-6 [93.3 kB] Fetched 93.3 kB in 0s (1700 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpwy0sd3dk/libarpack2t64_3.9.1-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 init-system-helpers all 1.69 [39.3 kB] Fetched 39.3 kB in 0s (1899 kB/s) dpkg-name: 
warning: skipping '/srv/rebuilderd/tmp/tmpxtk5nsrp/init-system-helpers_1.69_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libzstd-dev riscv64 1.5.7+dfsg-2 [1618 kB] Fetched 1618 kB in 0s (17.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp806pmirf/libzstd-dev_1.5.7+dfsg-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmpxx4ldbl riscv64 2:6.3.0+dfsg-5 [329 kB] Fetched 329 kB in 0s (12.0 MB/s) dpkg-name: info: moved 'libgmpxx4ldbl_2%3a6.3.0+dfsg-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmppmgw7j8c/libgmpxx4ldbl_6.3.0+dfsg-5_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libevent-dev riscv64 2.1.12-stable-10+b1 [523 kB] Fetched 523 kB in 0s (8180 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzqop5zzg/libevent-dev_2.1.12-stable-10+b1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 automake all 1:1.18.1-3 [878 kB] Fetched 878 kB in 0s (22.4 MB/s) dpkg-name: info: moved 'automake_1%3a1.18.1-3_all.deb' to '/srv/rebuilderd/tmp/tmpac114o8x/automake_1.18.1-3_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libpipeline1 riscv64 1.5.8-1 [40.3 kB] Fetched 40.3 kB in 0s (750 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1vg0fs_a/libpipeline1_1.5.8-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-dev riscv64 7.0.10-2 [846 kB] Fetched 846 kB in 0s (11.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpx4f7ewlf/libptscotch-dev_7.0.10-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libblas64-3 riscv64 3.12.1-7 [116 kB] Fetched 116 kB in 0s (2100 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmph_0rkpcs/libblas64-3_3.12.1-7_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libaudit1 riscv64 1:4.1.2-1 [58.2 kB] Fetched 58.2 kB in 0s (2782 kB/s) dpkg-name: info: moved 
'libaudit1_1%3a4.1.2-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmp3_hkto46/libaudit1_4.1.2-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 gettext-base riscv64 0.23.2-1 [244 kB] Fetched 244 kB in 0s (9606 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppvax_jma/gettext-base_0.23.2-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc64-complex3.24-dev riscv64 3.24.1+dfsg1-2 [13.6 MB] Fetched 13.6 MB in 5s (2717 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpha71m6u4/libpetsc64-complex3.24-dev_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 po-debconf all 1.0.21+nmu1 [248 kB] Fetched 248 kB in 0s (9824 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmps78myktd/po-debconf_1.0.21+nmu1_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libstdc++-15-dev riscv64 15.2.0-8 [6169 kB] Fetched 6169 kB in 0s (30.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7xctq67w/libstdc++-15-dev_15.2.0-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 perl-modules-5.40 all 5.40.1-7 [3012 kB] Fetched 3012 kB in 0s (25.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp4raa7i7w/perl-modules-5.40_5.40.1-7_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 ocl-icd-libopencl1 riscv64 2.3.4-1 [42.3 kB] Fetched 42.3 kB in 0s (789 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpeniap8xs/ocl-icd-libopencl1_2.3.4-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 opencl-c-headers all 3.0~2025.07.22-2 [47.6 kB] Fetched 47.6 kB in 0s (884 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdvzbz0wi/opencl-c-headers_3.0~2025.07.22-2_all.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libspex3 riscv64 1:7.11.0+dfsg-2 [68.8 kB] Fetched 68.8 kB in 0s (1269 kB/s) 
dpkg-name: info: moved 'libspex3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpg7a3xx01/libspex3_7.11.0+dfsg-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcc1-0 riscv64 15.2.0-8 [40.2 kB] Fetched 40.2 kB in 0s (752 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppdqrgnhp/libcc1-0_15.2.0-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libfortran-jonquil-0 riscv64 0.3.0-3 [19.3 kB] Fetched 19.3 kB in 0s (360 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjkzjvjo8/libfortran-jonquil-0_0.3.0-3_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 pkgconf riscv64 1.8.1-4 [26.1 kB] Fetched 26.1 kB in 0s (1267 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp830p4ibx/pkgconf_1.8.1-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsframe2 riscv64 2.45-8 [80.8 kB] Fetched 80.8 kB in 0s (1476 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpe9m9ompx/libsframe2_2.45-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 bsdextrautils riscv64 2.41.2-4 [102 kB] Fetched 102 kB in 0s (4620 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa3t2192k/bsdextrautils_2.41.2-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 linux-libc-dev all 6.17.8-1 [2552 kB] Fetched 2552 kB in 0s (22.4 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpua1y8lf4/linux-libc-dev_6.17.8-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 openmpi-bin riscv64 5.0.9-1 [192 kB] Fetched 192 kB in 0s (3368 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdckthlt9/openmpi-bin_5.0.9-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libreadline8t64 riscv64 8.3-3 [181 kB] Fetched 181 kB in 0s (7480 kB/s) dpkg-name: warning: skipping 
'/srv/rebuilderd/tmp/tmpaulb9qrd/libreadline8t64_8.3-3_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libdpkg-perl all 1.22.21 [650 kB] Fetched 650 kB in 0s (19.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxczz4708/libdpkg-perl_1.22.21_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 tar riscv64 1.35+dfsg-3.1 [822 kB] Fetched 822 kB in 0s (21.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_vibjdqt/tar_1.35+dfsg-3.1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libarpack2-dev riscv64 3.9.1-6 [192 kB] Fetched 192 kB in 0s (3367 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplqgwjcx7/libarpack2-dev_3.9.1-6_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libisl23 riscv64 0.27-1 [664 kB] Fetched 664 kB in 0s (19.5 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpgfoj4z7w/libisl23_0.27-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libattr1 riscv64 1:2.5.2-3 [22.9 kB] Fetched 22.9 kB in 0s (426 kB/s) dpkg-name: info: moved 'libattr1_1%3a2.5.2-3_riscv64.deb' to '/srv/rebuilderd/tmp/tmpdbdvrlng/libattr1_2.5.2-3_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc64-real3.24-dev riscv64 3.24.1+dfsg1-2 [13.5 MB] Fetched 13.5 MB in 5s (2683 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnba6_3mq/libpetsc64-real3.24-dev_3.24.1+dfsg1-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libldap2 riscv64 2.6.10+dfsg-1 [197 kB] Fetched 197 kB in 0s (8133 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7fmqkc0z/libldap2_2.6.10+dfsg-1_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libsasl2-2 riscv64 2.1.28+dfsg1-10 [60.6 kB] Fetched 60.6 kB in 0s (2803 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5t4qd33z/libsasl2-2_2.1.28+dfsg1-10_riscv64.deb' Get:1 
http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsemanage2 riscv64 3.9-1 [94.5 kB] Fetched 94.5 kB in 0s (1721 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcjthdt5p/libsemanage2_3.9-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 binutils-common riscv64 2.45-8 [2558 kB] Fetched 2558 kB in 0s (22.7 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpy82l5hv4/binutils-common_2.45-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libubsan1 riscv64 15.2.0-8 [1178 kB] Fetched 1178 kB in 0s (15.2 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_ar2kjr7/libubsan1_15.2.0-8_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libxdmcp-dev riscv64 1:1.1.5-1 [52.9 kB] Fetched 52.9 kB in 0s (2454 kB/s) dpkg-name: info: moved 'libxdmcp-dev_1%3a1.1.5-1_riscv64.deb' to '/srv/rebuilderd/tmp/tmpyuy_jogv/libxdmcp-dev_1.1.5-1_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-64-dev riscv64 7.0.10-2 [19.3 kB] Fetched 19.3 kB in 2s (9069 B/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk7ff5_rz/libscotch-64-dev_7.0.10-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libctf-nobfd0 riscv64 2.45-8 [163 kB] Fetched 163 kB in 0s (2884 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpsorpmpj7/libctf-nobfd0_2.45-8_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 autopoint all 0.23.2-1 [772 kB] Fetched 772 kB in 0s (21.1 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp50arxepn/autopoint_0.23.2-1_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 mpi-default-dev riscv64 1.19 [3436 B] Fetched 3436 B in 0s (65.3 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppg0j8eby/mpi-default-dev_1.19_riscv64.deb' Get:1 
http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libbrotli1 riscv64 1.1.0-2+b7 [358 kB] Fetched 358 kB in 0s (12.9 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8c7dx7de/libbrotli1_1.1.0-2+b7_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenblas64-pthread-dev riscv64 0.3.30+ds-3 [8860 kB] Fetched 8860 kB in 0s (27.6 MB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplbl1cpkk/libopenblas64-pthread-dev_0.3.30+ds-3_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-7.0c riscv64 7.0.10-2 [258 kB] Fetched 258 kB in 0s (4416 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxsa14gud/libscotch-7.0c_7.0.10-2_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libklu2 riscv64 1:7.11.0+dfsg-2 [91.9 kB] Fetched 91.9 kB in 0s (1675 kB/s) dpkg-name: info: moved 'libklu2_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpofbm35qo/libklu2_7.11.0+dfsg-2_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-hl-fortran-310 riscv64 1.14.5+repack-4 [43.7 kB] Fetched 43.7 kB in 0s (811 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo05m4t8t/libhdf5-openmpi-hl-fortran-310_1.14.5+repack-4_riscv64.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-hl-310 riscv64 1.14.5+repack-4 [71.8 kB] Fetched 71.8 kB in 0s (1307 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvomqkq12/libhdf5-openmpi-hl-310_1.14.5+repack-4_riscv64.deb' Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 dh-python all 6.20250414 [116 kB] Fetched 116 kB in 0s (2058 kB/s) dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmps9zbo597/dh-python_6.20250414_all.deb' Get:1 http://deb.debian.org/debian unstable/main riscv64 g++-riscv64-linux-gnu riscv64 4:15.2.0-4 [1196 B] Fetched 1196 B in 0s (58.9 kB/s) dpkg-name: info: moved 
'g++-riscv64-linux-gnu_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmp2_ooyoke/g++-riscv64-linux-gnu_15.2.0-4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgcc-s1 riscv64 15.2.0-8 [61.5 kB]
Fetched 61.5 kB in 0s (1139 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp0957obpd/libgcc-s1_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 liblapack3 riscv64 3.12.1-7 [1971 kB]
Fetched 1971 kB in 0s (32.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzy1l87t9/liblapack3_3.12.1-7_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 gfortran-riscv64-linux-gnu riscv64 4:15.2.0-4 [1284 B]
Fetched 1284 B in 0s (24.4 kB/s)
dpkg-name: info: moved 'gfortran-riscv64-linux-gnu_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpvlhra1kr/gfortran-riscv64-linux-gnu_15.2.0-4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libcholmod5 riscv64 1:7.11.0+dfsg-2 [653 kB]
Fetched 653 kB in 0s (9346 kB/s)
dpkg-name: info: moved 'libcholmod5_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp6lzwwy96/libcholmod5_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfido2-1 riscv64 1.16.0-2 [83.4 kB]
Fetched 83.4 kB in 0s (1519 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpt6atm018/libfido2-1_1.16.0-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 fortran-fpm riscv64 0.12.0-5 [510 kB]
Fetched 510 kB in 0s (8027 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjv00vjkm/fortran-fpm_0.12.0-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libopenmpi-dev riscv64 5.0.9-1 [1090 kB]
Fetched 1090 kB in 0s (13.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvepltmm0/libopenmpi-dev_5.0.9-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 gzip riscv64 1.13-1 [139 kB]
Fetched 139 kB in 0s (5996 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl1rvwjs6/gzip_1.13-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgfortran5 riscv64 15.2.0-8 [420 kB]
Fetched 420 kB in 0s (14.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnk0rlor7/libgfortran5_15.2.0-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-64-7.0 riscv64 7.0.10-2 [258 kB]
Fetched 258 kB in 2s (121 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp47sakz4j/libscotch-64-7.0_7.0.10-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libgssapi-krb5-2 riscv64 1.22.1-2 [141 kB]
Fetched 141 kB in 0s (6141 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjn9jt9kj/libgssapi-krb5-2_1.22.1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libc6 riscv64 2.41-12 [2472 kB]
Fetched 2472 kB in 0s (22.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplbu2dior/libc6_2.41-12_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libgdbm-compat4t64 riscv64 1.26-1 [52.9 kB]
Fetched 52.9 kB in 0s (2543 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpr07gtjy6/libgdbm-compat4t64_1.26-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-mpi-dev riscv64 1.14.5+repack-4 [18.8 kB]
Fetched 18.8 kB in 0s (356 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzeaycc9q/libhdf5-mpi-dev_1.14.5+repack-4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgnutls28-dev riscv64 3.8.10-3 [2859 kB]
Fetched 2859 kB in 0s (23.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzrh6prtw/libgnutls28-dev_3.8.10-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 autotools-dev all 20240727.1 [60.2 kB]
Fetched 60.2 kB in 0s (2796 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpczljud44/autotools-dev_20240727.1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotch-64i-7.0 riscv64 7.0.10-2 [153 kB]
Fetched 153 kB in 0s (2699 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo8t3lqel/libptscotch-64i-7.0_7.0.10-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libsepol2 riscv64 3.9-2 [306 kB]
Fetched 306 kB in 0s (11.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpmz58bp3j/libsepol2_3.9-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 groff-base riscv64 1.23.0-9 [1163 kB]
Fetched 1163 kB in 0s (25.7 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpogdyq0ll/groff-base_1.23.0-9_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libcombblas2.0.0t64 riscv64 2.0.0-7 [265 kB]
Fetched 265 kB in 0s (4505 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa_vegy4d/libcombblas2.0.0t64_2.0.0-7_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libncursesw6 riscv64 6.5+20251115-2 [141 kB]
Fetched 141 kB in 0s (2500 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmptz6mppa0/libncursesw6_6.5+20251115-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 dh-autoreconf all 21 [12.2 kB]
Fetched 12.2 kB in 0s (612 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp9bzt_g5h/dh-autoreconf_21_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 tzdata all 2025b-5 [260 kB]
Fetched 260 kB in 0s (10.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpqdt2j4tp/tzdata_2025b-5_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmumps-64pord-5.8 riscv64 5.8.1-2 [1838 kB]
Fetched 1838 kB in 0s (20.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmplj4c26ql/libmumps-64pord-5.8_5.8.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libarchive-zip-perl all 1.68-1 [104 kB]
Fetched 104 kB in 0s (4670 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpaljgtvaq/libarchive-zip-perl_1.68-1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 gfortran-15 riscv64 15.2.0-8 [18.3 kB]
Fetched 18.3 kB in 0s (343 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpjj61g0qv/gfortran-15_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre64-dev riscv64 2.33.0-3 [4393 kB]
Fetched 4393 kB in 0s (30.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp57naftcw/libhypre64-dev_2.33.0-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpsl5t64 riscv64 0.21.2-1.1+b1 [57.3 kB]
Fetched 57.3 kB in 0s (2686 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpkie1f20w/libpsl5t64_0.21.2-1.1+b1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libsuperlu-dist-dev riscv64 9.2.0+dfsg1-4 [5010 kB]
Fetched 5010 kB in 0s (29.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnihbg48k/libsuperlu-dist-dev_9.2.0+dfsg1-4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libdebconfclient0 riscv64 0.281 [10.6 kB]
Fetched 10.6 kB in 0s (200 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpht0x12kc/libdebconfclient0_0.281_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251124T032852Z sid/main riscv64 libpetsc-real3.24-dev riscv64 3.24.1+dfsg1-2 [13.5 MB]
Fetched 13.5 MB in 0s (34.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpnohne_vu/libpetsc-real3.24-dev_3.24.1+dfsg1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpsl-dev riscv64 0.21.2-1.1+b1 [87.2 kB]
Fetched 87.2 kB in 0s (1584 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpcns7asah/libpsl-dev_0.21.2-1.1+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 patch riscv64 2.8-2 [134 kB]
Fetched 134 kB in 0s (5889 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpn8kgtu_s/patch_2.8-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 netbase all 6.5 [12.4 kB]
Fetched 12.4 kB in 0s (630 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvp8n_7tb/netbase_6.5_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 nettle-dev riscv64 3.10.2-1 [1557 kB]
Fetched 1557 kB in 0s (16.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp5vz_n3jl/nettle-dev_3.10.2-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 findutils riscv64 4.10.0-3 [706 kB]
Fetched 706 kB in 0s (10.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpe84phaki/findutils_4.10.0-3_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 cpp-15-riscv64-linux-gnu riscv64 15.2.0-8 [14.8 MB]
Fetched 14.8 MB in 1s (26.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyfofcz6s/cpp-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 debhelper all 13.28 [941 kB]
Fetched 941 kB in 0s (23.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp85m88zvw/debhelper_13.28_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 bzip2 riscv64 1.0.8-6 [40.5 kB]
Fetched 40.5 kB in 0s (1944 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmppd7mhj7k/bzip2_1.0.8-6_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 chrpath riscv64 0.18-1 [14.0 kB]
Fetched 14.0 kB in 0s (265 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoaamejeu/chrpath_0.18-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 gfortran-15-riscv64-linux-gnu riscv64 15.2.0-8 [15.2 MB]
Fetched 15.2 MB in 0s (41.5 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpev8an6zq/gfortran-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libnuma1 riscv64 2.0.19-1 [23.2 kB]
Fetched 23.2 kB in 0s (1144 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp62xfpcdw/libnuma1_2.0.19-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libnettle8t64 riscv64 3.10.2-1 [332 kB]
Fetched 332 kB in 0s (12.0 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp580ukz73/libnettle8t64_3.10.2-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libtsan2 riscv64 15.2.0-8 [2653 kB]
Fetched 2653 kB in 0s (24.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzfc2yvvr/libtsan2_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libparpack2-dev riscv64 3.9.1-6 [171 kB]
Fetched 171 kB in 0s (3011 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8hi45bw7/libparpack2-dev_3.9.1-6_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 file riscv64 1:5.46-5 [43.4 kB]
Fetched 43.4 kB in 0s (797 kB/s)
dpkg-name: info: moved 'file_1%3a5.46-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmp9og15l17/file_5.46-5_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libptscotcherr-7.0 riscv64 7.0.10-2 [12.1 kB]
Fetched 12.1 kB in 0s (229 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpthv0xyxa/libptscotcherr-7.0_7.0.10-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 comerr-dev riscv64 2.1-1.47.2-3+b3 [60.1 kB]
Fetched 60.1 kB in 0s (1116 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpxaq4hy6s/comerr-dev_2.1-1.47.2-3+b3_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libsuitesparse-dev riscv64 1:7.11.0+dfsg-2 [4195 kB]
Fetched 4195 kB in 0s (29.7 MB/s)
dpkg-name: info: moved 'libsuitesparse-dev_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmp_sxas7ca/libsuitesparse-dev_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgcc-15-dev riscv64 15.2.0-8 [5663 kB]
Fetched 5663 kB in 0s (33.1 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd7yhjf57/libgcc-15-dev_15.2.0-8_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libccolamd3 riscv64 1:7.11.0+dfsg-2 [47.0 kB]
Fetched 47.0 kB in 0s (2251 kB/s)
dpkg-name: info: moved 'libccolamd3_1%3a7.11.0+dfsg-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpb4q4vena/libccolamd3_7.11.0+dfsg-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libjpeg62-turbo-dev riscv64 1:2.1.5-4 [420 kB]
Fetched 420 kB in 0s (6786 kB/s)
dpkg-name: info: moved 'libjpeg62-turbo-dev_1%3a2.1.5-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpda2fzulj/libjpeg62-turbo-dev_2.1.5-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 librtmp-dev riscv64 2.4+20151223.gitfa8646d.1-3 [119 kB]
Fetched 119 kB in 0s (2141 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpduu6_ieb/librtmp-dev_2.4+20151223.gitfa8646d.1-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 bash riscv64 5.3-1 [1560 kB]
Fetched 1560 kB in 0s (17.4 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpywyr3i5v/bash_5.3-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libmagic1t64 riscv64 1:5.46-5 [117 kB]
Fetched 117 kB in 0s (5144 kB/s)
dpkg-name: info: moved 'libmagic1t64_1%3a5.46-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpf2edo64v/libmagic1t64_5.46-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libngtcp2-crypto-ossl-dev riscv64 1.16.0-1 [51.8 kB]
Fetched 51.8 kB in 0s (961 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1c0uejyt/libngtcp2-crypto-ossl-dev_1.16.0-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 coreutils riscv64 9.7-3 [3036 kB]
Fetched 3036 kB in 0s (25.2 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp20bp02ra/coreutils_9.7-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libaec-dev riscv64 1.1.4-2 [42.0 kB]
Fetched 42.0 kB in 0s (783 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq7w6soyl/libaec-dev_1.1.4-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libunbound8 riscv64 1.24.1-2 [616 kB]
Fetched 616 kB in 0s (9342 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp_3z3pmv7/libunbound8_1.24.1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 g++-15 riscv64 15.2.0-8 [24.4 kB]
Fetched 24.4 kB in 0s (458 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpl6nr1a5m/g++-15_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-single3 riscv64 3.3.10-2+b1 [381 kB]
Fetched 381 kB in 0s (6247 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpvix1qbb5/libfftw3-single3_3.3.10-2+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-310 riscv64 1.14.5+repack-4 [1446 kB]
Fetched 1446 kB in 0s (17.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa7pdiu1c/libhdf5-openmpi-310_1.14.5+repack-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 openssl-provider-legacy riscv64 3.5.4-1 [310 kB]
Fetched 310 kB in 0s (11.6 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpzt4y1_nu/openssl-provider-legacy_3.5.4-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 gcc-15-riscv64-linux-gnu riscv64 15.2.0-8 [28.7 MB]
Fetched 28.7 MB in 1s (35.7 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpd907bd9j/gcc-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb'
Downloading dependency 248 of 393: libnghttp2-dev:riscv64=1.64.0-1.1+b1
Downloading dependency 249 of 393: libpetsc-complex3.24-dev:riscv64=3.24.1+dfsg1-2
Downloading dependency 250 of 393: libyaml-dev:riscv64=0.2.5-2
Downloading dependency 251 of 393: dh-fortran:riscv64=0.57
Downloading dependency 252 of 393: libgprofng0:riscv64=2.45-8
Downloading dependency 253 of 393: libscalapack-openmpi2.2:riscv64=2.2.2-2
Downloading dependency 254 of 393: libscotcherr-7.0:riscv64=7.0.10-2
Downloading dependency 255 of 393: libarpack2t64:riscv64=3.9.1-6
Downloading dependency 256 of 393: init-system-helpers:riscv64=1.69
Downloading dependency 257 of 393: libzstd-dev:riscv64=1.5.7+dfsg-2
Downloading dependency 258 of 393: libgmpxx4ldbl:riscv64=2:6.3.0+dfsg-5
Downloading dependency 259 of 393: libevent-dev:riscv64=2.1.12-stable-10+b1
Downloading dependency 260 of 393: automake:riscv64=1:1.18.1-3
Downloading dependency 261 of 393: libpipeline1:riscv64=1.5.8-1
Downloading dependency 262 of 393: libptscotch-dev:riscv64=7.0.10-2
Downloading dependency 263 of 393: libblas64-3:riscv64=3.12.1-7
Downloading dependency 264 of 393: libaudit1:riscv64=1:4.1.2-1
Downloading dependency 265 of 393: gettext-base:riscv64=0.23.2-1
Downloading dependency 266 of 393: libpetsc64-complex3.24-dev:riscv64=3.24.1+dfsg1-2
Downloading dependency 267 of 393: po-debconf:riscv64=1.0.21+nmu1
Downloading dependency 268 of 393: libstdc++-15-dev:riscv64=15.2.0-8
Downloading dependency 269 of 393: perl-modules-5.40:riscv64=5.40.1-7
Downloading dependency 270 of 393: ocl-icd-libopencl1:riscv64=2.3.4-1
Downloading dependency 271 of 393: opencl-c-headers:riscv64=3.0~2025.07.22-2
Downloading dependency 272 of 393: libspex3:riscv64=1:7.11.0+dfsg-2
Downloading dependency 273 of 393: libcc1-0:riscv64=15.2.0-8
Downloading dependency 274 of 393: libfortran-jonquil-0:riscv64=0.3.0-3
Downloading dependency 275 of 393: pkgconf:riscv64=1.8.1-4
Downloading dependency 276 of 393: libsframe2:riscv64=2.45-8
Downloading dependency 277 of 393: bsdextrautils:riscv64=2.41.2-4
Downloading dependency 278 of 393: linux-libc-dev:riscv64=6.17.8-1
Downloading dependency 279 of 393: openmpi-bin:riscv64=5.0.9-1
Downloading dependency 280 of 393: libreadline8t64:riscv64=8.3-3
Downloading dependency 281 of 393: libdpkg-perl:riscv64=1.22.21
Downloading dependency 282 of 393: tar:riscv64=1.35+dfsg-3.1
Downloading dependency 283 of 393: libarpack2-dev:riscv64=3.9.1-6
Downloading dependency 284 of 393: libisl23:riscv64=0.27-1
Downloading dependency 285 of 393: libattr1:riscv64=1:2.5.2-3
Downloading dependency 286 of 393: libpetsc64-real3.24-dev:riscv64=3.24.1+dfsg1-2
Downloading dependency 287 of 393: libldap2:riscv64=2.6.10+dfsg-1
Downloading dependency 288 of 393: libsasl2-2:riscv64=2.1.28+dfsg1-10
Downloading dependency 289 of 393: libsemanage2:riscv64=3.9-1
Downloading dependency 290 of 393: binutils-common:riscv64=2.45-8
Downloading dependency 291 of 393: libubsan1:riscv64=15.2.0-8
Downloading dependency 292 of 393: libxdmcp-dev:riscv64=1:1.1.5-1
Downloading dependency 293 of 393: libscotch-64-dev:riscv64=7.0.10-2
Downloading dependency 294 of 393: libctf-nobfd0:riscv64=2.45-8
Downloading dependency 295 of 393: autopoint:riscv64=0.23.2-1
Downloading dependency 296 of 393: mpi-default-dev:riscv64=1.19
Downloading dependency 297 of 393: libbrotli1:riscv64=1.1.0-2+b7
Downloading dependency 298 of 393: libopenblas64-pthread-dev:riscv64=0.3.30+ds-3
Downloading dependency 299 of 393: libscotch-7.0c:riscv64=7.0.10-2
Downloading dependency 300 of 393: libklu2:riscv64=1:7.11.0+dfsg-2
Downloading dependency 301 of 393: libhdf5-openmpi-hl-fortran-310:riscv64=1.14.5+repack-4
Downloading dependency 302 of 393: libhdf5-openmpi-hl-310:riscv64=1.14.5+repack-4
Downloading dependency 303 of 393: dh-python:riscv64=6.20250414
Downloading dependency 304 of 393: g++-riscv64-linux-gnu:riscv64=4:15.2.0-4
Downloading dependency 305 of 393: libgcc-s1:riscv64=15.2.0-8
Downloading dependency 306 of 393: liblapack3:riscv64=3.12.1-7
Downloading dependency 307 of 393: gfortran-riscv64-linux-gnu:riscv64=4:15.2.0-4
Downloading dependency 308 of 393: libcholmod5:riscv64=1:7.11.0+dfsg-2
Downloading dependency 309 of 393: libfido2-1:riscv64=1.16.0-2
Downloading dependency 310 of 393: fortran-fpm:riscv64=0.12.0-5
Downloading dependency 311 of 393: libopenmpi-dev:riscv64=5.0.9-1
Downloading dependency 312 of 393: gzip:riscv64=1.13-1
Downloading dependency 313 of 393: libgfortran5:riscv64=15.2.0-8
Downloading dependency 314 of 393: libscotch-64-7.0:riscv64=7.0.10-2
Downloading dependency 315 of 393: libgssapi-krb5-2:riscv64=1.22.1-2
Downloading dependency 316 of 393: libc6:riscv64=2.41-12
Downloading dependency 317 of 393: libgdbm-compat4t64:riscv64=1.26-1
Downloading dependency 318 of 393: libhdf5-mpi-dev:riscv64=1.14.5+repack-4
Downloading dependency 319 of 393: libgnutls28-dev:riscv64=3.8.10-3
Downloading dependency 320 of 393: autotools-dev:riscv64=20240727.1
Downloading dependency 321 of 393: libptscotch-64i-7.0:riscv64=7.0.10-2
Downloading dependency 322 of 393: libsepol2:riscv64=3.9-2
Downloading dependency 323 of 393: groff-base:riscv64=1.23.0-9
Downloading dependency 324 of 393: libcombblas2.0.0t64:riscv64=2.0.0-7
Downloading dependency 325 of 393: libncursesw6:riscv64=6.5+20251115-2
Downloading dependency 326 of 393: dh-autoreconf:riscv64=21
Downloading dependency 327 of 393: tzdata:riscv64=2025b-5
Downloading dependency 328 of 393: libmumps-64pord-5.8:riscv64=5.8.1-2
Downloading dependency 329 of 393: libarchive-zip-perl:riscv64=1.68-1
Downloading dependency 330 of 393: gfortran-15:riscv64=15.2.0-8
Downloading dependency 331 of 393: libhypre64-dev:riscv64=2.33.0-3
Downloading dependency 332 of 393: libpsl5t64:riscv64=0.21.2-1.1+b1
Downloading dependency 333 of 393: libsuperlu-dist-dev:riscv64=9.2.0+dfsg1-4
Downloading dependency 334 of 393: libdebconfclient0:riscv64=0.281
Downloading dependency 335 of 393: libpetsc-real3.24-dev:riscv64=3.24.1+dfsg1-2
Downloading dependency 336 of 393: libpsl-dev:riscv64=0.21.2-1.1+b1
Downloading dependency 337 of 393: patch:riscv64=2.8-2
Downloading dependency 338 of 393: netbase:riscv64=6.5
Downloading dependency 339 of 393: nettle-dev:riscv64=3.10.2-1
Downloading dependency 340 of 393: findutils:riscv64=4.10.0-3
Downloading dependency 341 of 393: cpp-15-riscv64-linux-gnu:riscv64=15.2.0-8
Downloading dependency 342 of 393: debhelper:riscv64=13.28
Downloading dependency 343 of 393: bzip2:riscv64=1.0.8-6
Downloading dependency 344 of 393: chrpath:riscv64=0.18-1
Downloading dependency 345 of 393: gfortran-15-riscv64-linux-gnu:riscv64=15.2.0-8
Downloading dependency 346 of 393: libnuma1:riscv64=2.0.19-1
Downloading dependency 347 of 393: libnettle8t64:riscv64=3.10.2-1
Downloading dependency 348 of 393: libtsan2:riscv64=15.2.0-8
Downloading dependency 349 of 393: libparpack2-dev:riscv64=3.9.1-6
Downloading dependency 350 of 393: file:riscv64=1:5.46-5
Downloading dependency 351 of 393: libptscotcherr-7.0:riscv64=7.0.10-2
Downloading dependency 352 of 393: comerr-dev:riscv64=2.1-1.47.2-3+b3
Downloading dependency 353 of 393: libsuitesparse-dev:riscv64=1:7.11.0+dfsg-2
Downloading dependency 354 of 393: libgcc-15-dev:riscv64=15.2.0-8
Downloading dependency 355 of 393: libccolamd3:riscv64=1:7.11.0+dfsg-2
Downloading dependency 356 of 393: libjpeg62-turbo-dev:riscv64=1:2.1.5-4
Downloading dependency 357 of 393: librtmp-dev:riscv64=2.4+20151223.gitfa8646d.1-3
Downloading dependency 358 of 393: bash:riscv64=5.3-1
Downloading dependency 359 of 393: libmagic1t64:riscv64=1:5.46-5
Downloading dependency 360 of 393: libngtcp2-crypto-ossl-dev:riscv64=1.16.0-1
Downloading dependency 361 of 393: coreutils:riscv64=9.7-3
Downloading dependency 362 of 393: libaec-dev:riscv64=1.1.4-2
Downloading dependency 363 of 393: libunbound8:riscv64=1.24.1-2
Downloading dependency 364 of 393: g++-15:riscv64=15.2.0-8
Downloading dependency 365 of 393: libfftw3-single3:riscv64=3.3.10-2+b1
Downloading dependency 366 of 393: libhdf5-openmpi-310:riscv64=1.14.5+repack-4
Downloading dependency 367 of 393: openssl-provider-legacy:riscv64=3.5.4-1
Downloading dependency 368 of 393: gcc-15-riscv64-linux-gnu:riscv64=15.2.0-8
Downloading dependency 369 of 393: libhypre-dev:riscv64=2.33.0-3
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhypre-dev riscv64 2.33.0-3 [4506 kB]
Fetched 4506 kB in 0s (30.8 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpyxj8wu2n/libhypre-dev_2.33.0-3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 ocl-icd-opencl-dev riscv64 2.3.4-1 [8868 B]
Fetched 8868 B in 0s (167 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp6r4ra9rd/ocl-icd-opencl-dev_2.3.4-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-mpi3 riscv64 3.3.10-2+b1 [57.9 kB]
Fetched 57.9 kB in 0s (1072 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp03_y6yeh/libfftw3-mpi3_3.3.10-2+b1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libhdf5-openmpi-cpp-310 riscv64 1.14.5+repack-4 [131 kB]
Fetched 131 kB in 0s (2352 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpntvb8vsk/libhdf5-openmpi-cpp-310_1.14.5+repack-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpciaccess0 riscv64 0.17-3+b3 [51.6 kB]
Fetched 51.6 kB in 0s (960 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp11zqzpi3/libpciaccess0_0.17-3+b3_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libpcre2-8-0 riscv64 10.46-1 [301 kB]
Fetched 301 kB in 0s (11.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpknnlghyt/libpcre2-8-0_10.46-1_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libgomp1 riscv64 15.2.0-8 [131 kB]
Fetched 131 kB in 0s (2345 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7wgerj9i/libgomp1_15.2.0-8_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libgmp-dev riscv64 2:6.3.0+dfsg-5 [1024 kB]
Fetched 1024 kB in 0s (24.3 MB/s)
dpkg-name: info: moved 'libgmp-dev_2%3a6.3.0+dfsg-5_riscv64.deb' to '/srv/rebuilderd/tmp/tmpbku_dthd/libgmp-dev_6.3.0+dfsg-5_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libjs-jquery all 3.7.1+dfsg+~3.5.33-1 [319 kB]
Fetched 319 kB in 0s (5324 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp8u4p8a24/libjs-jquery_3.7.1+dfsg+~3.5.33-1_all.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libnl-route-3-200 riscv64 3.11.0-2 [193 kB]
Fetched 193 kB in 0s (3383 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpdt4zcb2s/libnl-route-3-200_3.11.0-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libc-dev-bin riscv64 2.41-12 [57.4 kB]
Fetched 57.4 kB in 0s (2753 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpae7gpdmq/libc-dev-bin_2.41-12_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 openssh-client riscv64 1:10.2p1-2 [1017 kB]
Fetched 1017 kB in 0s (12.3 MB/s)
dpkg-name: info: moved 'openssh-client_1%3a10.2p1-2_riscv64.deb' to '/srv/rebuilderd/tmp/tmpprao552q/openssh-client_10.2p1-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libscotch-dev riscv64 7.0.10-2 [1320 kB]
Fetched 1320 kB in 0s (16.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpve_t6xj6/libscotch-dev_7.0.10-2_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libyaml-0-2 riscv64 0.2.5-2 [55.8 kB]
Fetched 55.8 kB in 0s (2654 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpk_3sol75/libyaml-0-2_0.2.5-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libnghttp3-9 riscv64 1.12.0-1 [68.4 kB]
Fetched 68.4 kB in 0s (3173 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpo3kgqgj5/libnghttp3-9_1.12.0-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libssh2-1t64 riscv64 1.11.1-1 [249 kB]
Fetched 249 kB in 0s (9810 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpa183909f/libssh2-1t64_1.11.1-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 debconf all 1.5.91 [121 kB]
Fetched 121 kB in 0s (5397 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpoj_wvun5/debconf_1.5.91_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 cpp-riscv64-linux-gnu riscv64 4:15.2.0-4 [5284 B]
Fetched 5284 B in 0s (257 kB/s)
dpkg-name: info: moved 'cpp-riscv64-linux-gnu_4%3a15.2.0-4_riscv64.deb' to '/srv/rebuilderd/tmp/tmpf7ywohud/cpp-riscv64-linux-gnu_15.2.0-4_riscv64.deb'
Get:1 http://snapshot.debian.org/archive/debian/20251120T202450Z sid/main riscv64 libudev1 riscv64 259~rc1-1 [158 kB]
Fetched 158 kB in 0s (2788 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp1cguv5p2/libudev1_259~rc1-1_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 hostname riscv64 3.25 [10.7 kB]
Fetched 10.7 kB in 0s (539 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpp6hz_bw7/hostname_3.25_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libk5crypto3 riscv64 1.22.1-2 [98.4 kB]
Fetched 98.4 kB in 0s (4412 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpih4yvt8f/libk5crypto3_1.22.1-2_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libidn2-dev riscv64 2.3.8-4 [135 kB]
Fetched 135 kB in 0s (2427 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpq25dzbjx/libidn2-dev_2.3.8-4_riscv64.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 x11proto-dev all 2024.1-1 [603 kB]
Fetched 603 kB in 0s (18.3 MB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmpaf6su5vl/x11proto-dev_2024.1-1_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 login.defs all 1:4.18.0-2 [211 kB]
Fetched 211 kB in 0s (8611 kB/s)
dpkg-name: info: moved 'login.defs_1%3a4.18.0-2_all.deb' to '/srv/rebuilderd/tmp/tmpliihwjno/login.defs_4.18.0-2_all.deb'
Get:1 http://deb.debian.org/debian unstable/main riscv64 libfftw3-bin riscv64 3.3.10-2+b1 [44.9 kB]
Fetched 44.9 kB in 0s (837 kB/s)
dpkg-name: warning: skipping '/srv/rebuilderd/tmp/tmp7eu8rg3d/libfftw3-bin_3.3.10-2+b1_riscv64.deb'
dpkg-buildpackage: info: source package debootsnap-dummy
dpkg-buildpackage: info: source version 1.0
dpkg-buildpackage: info: source distribution unstable
dpkg-buildpackage: info: source changed by Equivs Dummy Package Generator
dpkg-source --before-build .
dpkg-buildpackage: info: host architecture riscv64
debian/rules clean
dh clean
dh_clean
debian/rules binary
dh binary
dh_update_autotools_config
dh_autoreconf
create-stamp debian/debhelper-build-stamp
dh_prep
dh_auto_install --destdir=debian/debootsnap-dummy/
dh_install
dh_installdocs
dh_installchangelogs
dh_perl
dh_link
dh_strip_nondeterminism
dh_compress
dh_fixperms
dh_missing
dh_installdeb
dh_gencontrol
dh_md5sums
dh_builddeb
dpkg-deb: building package 'debootsnap-dummy' in '../debootsnap-dummy_1.0_all.deb'.
dpkg-genbuildinfo --build=binary -O../debootsnap-dummy_1.0_riscv64.buildinfo
dpkg-genchanges --build=binary -O../debootsnap-dummy_1.0_riscv64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
dpkg-source --after-build .
dpkg-buildpackage: info: binary-only upload (no source included)
The package has been created.
Attention, the package has been created in the /srv/rebuilderd/tmp/tmpc7yjv7bx/cache directory, not in ".." as indicated by the message above!
I: automatically chosen mode: unshare
I: chroot architecture riscv64 is equal to the host's architecture
I: using /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re as tempdir
I: running --setup-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/setup00.sh /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re
127.0.0.1 - - [05/Jan/2026 18:30:19] code 404, message File not found
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./InRelease HTTP/1.1" 404 -
Ign:1 http://localhost:37595 ./ InRelease
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./Release HTTP/1.1" 200 -
Get:2 http://localhost:37595 ./ Release [462 B]
127.0.0.1 - - [05/Jan/2026 18:30:19] code 404, message File not found
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./Release.gpg HTTP/1.1" 404 -
Ign:3 http://localhost:37595 ./ Release.gpg
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./Packages HTTP/1.1" 200 -
Get:4 http://localhost:37595 ./ Packages [504 kB]
Fetched 505 kB in 0s (8932 kB/s)
Reading package lists...
usr-is-merged found but not real -- not running merged-usr setup hook
I: skipping apt-get update because it was already run
I: downloading packages with apt...
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./gcc-15-base_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libc6_2.41-12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libgcc-s1_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./mawk_1.3.4.20250131-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./base-files_14_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libtinfo6_6.5%2b20251115-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./debianutils_5.23.2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./bash_5.3-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libacl1_2.3.2-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libattr1_2.5.2-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libcap2_2.75-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libgmp10_6.3.0%2bdfsg-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libpcre2-8-0_10.46-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libselinux1_3.9-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libzstd1_1.5.7%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./zlib1g_1.3.dfsg%2breally1.3.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libssl3t64_3.5.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./openssl-provider-legacy_3.5.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libsystemd0_259%7erc1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./coreutils_9.7-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./dash_0.5.12-12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./diffutils_3.12-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libbz2-1.0_1.0.8-6_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./liblzma5_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./libmd0_1.1.0-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./tar_1.35%2bdfsg-3.1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./dpkg_1.22.21_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:19] "GET /./findutils_4.10.0-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./grep_3.12-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./gzip_1.13-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./hostname_3.25_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./ncurses-bin_6.5%2b20250216-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libcrypt1_4.5.1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./perl-base_5.40.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./sed_4.9-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libaudit-common_4.1.2-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libcap-ng0_0.8.5-4%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libaudit1_4.1.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libdb5.3t64_5.3.28%2bdfsg2-10_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./debconf_1.5.91_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libpam0g_1.7.0-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libpam-modules-bin_1.7.0-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libpam-modules_1.7.0-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libpam-runtime_1.7.0-5_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libblkid1_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libmount1_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libsmartcols1_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libudev1_259%7erc1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libuuid1_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./util-linux_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libdebconfclient0_0.281_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./base-passwd_3.6.8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./init-system-helpers_1.69_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./libc-bin_2.41-12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./ncurses-base_6.5%2b20250216-2_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:20] "GET /./sysvinit-utils_3.15-6_riscv64.deb HTTP/1.1" 200 -
I: extracting archives...
I: running --extract-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/extract00.sh /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re
127.0.0.1 - - [05/Jan/2026 18:30:23] code 404, message File not found
127.0.0.1 - - [05/Jan/2026 18:30:23] "GET /./InRelease HTTP/1.1" 404 -
Ign:1 http://localhost:37595 ./ InRelease
127.0.0.1 - - [05/Jan/2026 18:30:23] "GET /./Release HTTP/1.1" 304 -
Hit:2 http://localhost:37595 ./ Release
127.0.0.1 - - [05/Jan/2026 18:30:23] code 404, message File not found
127.0.0.1 - - [05/Jan/2026 18:30:23] "GET /./Release.gpg HTTP/1.1" 404 -
Ign:3 http://localhost:37595 ./ Release.gpg
Reading package lists...
usr-is-merged found but not real -- not running merged-usr extract hook
I: installing essential packages...
I: running --essential-hook directly: /usr/share/mmdebstrap/hooks/maybe-merged-usr/essential00.sh /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re
usr-is-merged was not installed in a previous hook -- not running merged-usr essential hook
I: installing remaining packages inside the chroot...
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libexpat1_2.7.3-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libpython3.13-minimal_3.13.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./python3.13-minimal_3.13.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./python3-minimal_3.13.7-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./media-types_14.0.0_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./netbase_6.5_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./tzdata_2025b-5_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libffi8_3.5.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libncursesw6_6.5%2b20251115-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./readline-common_8.3-3_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libreadline8t64_8.3-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libsqlite3-0_3.46.1-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libpython3.13-stdlib_3.13.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./python3.13_3.13.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libpython3-stdlib_3.13.7-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./python3_3.13.7-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./login.defs_4.18.0-2_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libbsd0_0.12.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libsemanage-common_3.9-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libsepol2_3.9-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libsemanage2_3.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./passwd_4.18.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./sensible-utils_0.0.26_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./adduser_3.153_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libstdc%2b%2b6_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libuchardet0_0.0.8-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./groff-base_1.23.0-9_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./bsdextrautils_2.41.2-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libgdbm6t64_1.26-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libpipeline1_1.5.8-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./man-db_2.13.1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./bzip2_1.0.8-6_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libmagic-mgc_5.46-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libmagic1t64_5.46-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./file_5.46-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./gettext-base_0.23.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libedit2_3.1-20250104-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libcbor0.10_0.10.2-2.1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libfido2-1_1.16.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libkrb5support0_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libcom-err2_1.47.2-3%2bb3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libk5crypto3_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libkeyutils1_1.6.3-6_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libkrb5-3_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libgssapi-krb5-2_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./openssh-client_10.2p1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./perl-modules-5.40_5.40.1-7_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libgdbm-compat4t64_1.26-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:35] "GET /./libperl5.40_5.40.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./perl_5.40.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./xz-utils_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./m4_1.4.20-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./autoconf_2.72-3.1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./autotools-dev_20240727.1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./automake_1.18.1-3_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./autopoint_0.23.2-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libsframe2_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./binutils-common_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libbinutils_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libgprofng0_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libctf-nobfd0_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libctf0_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libjansson4_2.14-2%2bb3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./binutils-riscv64-linux-gnu_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./binutils_2.45-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libc-dev-bin_2.41-12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./linux-libc-dev_6.17.8-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libcrypt-dev_4.5.1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./rpcsvc-proto_1.4.3-1%2bb2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libc6-dev_2.41-12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libisl23_0.27-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libmpfr6_4.2.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./libmpc3_1.3.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:36] "GET /./cpp-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./cpp-15_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./cpp-riscv64-linux-gnu_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./cpp_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libcc1-0_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libgomp1_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libitm1_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libatomic1_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libasan8_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./liblsan0_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libtsan2_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libubsan1_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./libgcc-15-dev_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:37] "GET /./gcc-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:38] "GET /./gcc-15_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:38] "GET /./gcc-riscv64-linux-gnu_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:38] "GET /./gcc_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:38] "GET /./libstdc%2b%2b-15-dev_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:38] "GET /./g%2b%2b-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./g%2b%2b-15_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./g%2b%2b-riscv64-linux-gnu_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./g%2b%2b_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./make_4.4.1-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libdpkg-perl_1.22.21_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./patch_2.8-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./dpkg-dev_1.22.21_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./build-essential_12.12_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./chrpath_0.18-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./comerr-dev_2.1-1.47.2-3%2bb3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libdebhelper-perl_13.28_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libtool_2.5.4-7_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./dh-autoreconf_21_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libarchive-zip-perl_1.68-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libfile-stripnondeterminism-perl_1.15.0-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./dh-strip-nondeterminism_1.15.0-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libelf1t64_0.194-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./dwz_0.16-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libunistring5_1.3-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxml2-16_2.15.1%2bdfsg-0.4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./gettext_0.23.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./intltool-debian_0.35.0%2b20060710.6_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./po-debconf_1.0.21%2bnmu1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./debhelper_13.28_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libnettle8t64_3.10.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libhogweed6t64_3.10.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libidn2-0_2.3.8-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libp11-kit0_0.25.10-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libtasn1-6_4.20.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libgnutls30t64_3.8.10-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libgnutls-openssl27t64_3.8.10-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libgfortran5_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libgfortran-15-dev_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxau6_1.0.11-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxdmcp6_1.1.5-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxcb1_1.17.0-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./zlib1g-dev_1.3.dfsg%2breally1.3.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./xtrans-dev_1.6.0-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libx11-data_1.8.12-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libx11-6_1.8.12-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./xorg-sgml-doctools_1.11-1.1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./x11proto-dev_2024.1-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxau-dev_1.0.11-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxdmcp-dev_1.1.5-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libxcb1-dev_1.17.0-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libx11-dev_1.8.12-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libmumps-headers-dev_5.8.1-2_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./libblas3_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:39] "GET /./liblapack3_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libevent-core-2.1-7t64_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libevent-pthreads-2.1-7t64_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnl-3-200_3.11.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnl-route-3-200_3.11.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libibverbs1_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./ibverbs-providers_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./librdmacm1t64_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libfabric1_2.1.0-1.1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libhwloc15_2.12.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libmunge2_0.5.16-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libpciaccess0_0.17-3%2bb3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libxext6_1.3.4-1%2bb3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libxnvctrl0_535.171.04-1%2bb2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./ocl-icd-libopencl1_2.3.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libhwloc-plugins_2.12.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libpmix2t64_6.0.0%2breally5.0.9-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libfuse3-4_3.17.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libibumad3_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libibmad5_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libucx0_1.19.0%2bds-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libopenmpi40_5.0.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./openmpi-common_5.0.9-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./openmpi-bin_5.0.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./mpi-default-bin_1.19_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libscalapack-openmpi2.2_2.2.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libmumps-5.8_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libscalapack-openmpi-dev_2.2.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./gfortran-15-riscv64-linux-gnu_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./gfortran-15_15.2.0-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./gfortran-riscv64-linux-gnu_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./gfortran_15.2.0-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnl-3-dev_3.11.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnl-route-3-dev_3.11.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libibverbs-dev_56.1-1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnuma1_2.0.19-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:40] "GET /./libnuma-dev_2.0.19-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libltdl7_2.5.4-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libltdl-dev_2.5.4-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libhwloc-dev_2.12.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libevent-2.1-7t64_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libevent-extra-2.1-7t64_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libevent-openssl-2.1-7t64_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libevent-dev_2.1.12-stable-10%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libjs-jquery_3.7.1%2bdfsg%2b%7e3.5.33-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libjs-jquery-ui_1.13.2%2bdfsg-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libopenmpi-dev_5.0.9-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./mpi-default-dev_1.19_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libscalapack-mpi-dev_2.2.2-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libmumps-dev_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libblas-dev_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libsasl2-modules-db_2.1.28%2bdfsg1-10_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libldl3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfftw3-double3_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libp11-kit-dev_0.25.10-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libcombblas2.0.0t64_2.0.0-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libmetis5_5.1.0.dfsg-8_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libptscotcherr-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libscotcherr-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libscotch-7.0c_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libptscotch-7.0c_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libsuperlu-dist9_9.2.0%2bdfsg1-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libhypre-2.33.0_2.33.0-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libsuitesparseconfig7_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libcolamd3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libsuperlu7_7.0.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libssl-dev_3.5.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libsuperlu-dev_7.0.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfftw3-long3_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfftw3-single3_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfftw3-bin_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfftw3-dev_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libngtcp2-16_1.16.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libngtcp2-dev_1.16.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libfile-libmagic-perl_1.23-2%2bb2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libmumps-64pord-5.8_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:41] "GET /./libmumps64-dev_5.8.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libopenblas64-0-pthread_0.3.30%2bds-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libopenblas64-0_0.3.30%2bds-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libamd3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libnghttp3-9_1.12.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libbrotli1_1.1.0-2%2bb7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libsasl2-2_2.1.28%2bdfsg1-10_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libldap2_2.6.10%2bdfsg-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libnghttp2-14_1.64.0-1.1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libngtcp2-crypto-ossl0_1.16.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libpsl5t64_0.21.2-1.1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./librtmp1_2.4%2b20151223.gitfa8646d.1-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libssh2-1t64_1.11.1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libcurl4t64_8.17.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libaec0_1.1.4-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libsz2_1.1.4-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libhdf5-openmpi-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libhdf5-openmpi-fortran-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./opencl-c-headers_3.0%7e2025.07.22-2_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./opencl-clhpp-headers_3.0%7e2025.07.22-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libblas64-3_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./liblapack64-3_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libhypre64-2.33.0_2.33.0-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libcamd3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libccolamd3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libcholmod5_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libumfpack6_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libparu1_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libssh2-1-dev_1.11.1-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libgssrpc4t64_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libkadm5clnt-mit12_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libkdb5-10t64_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libkadm5srv-mit12_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./krb5-multidev_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libkrb5-dev_1.22.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./python3-magic_0.4.27-3_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./python3-click_8.2.0%2b0.really.8.1.8-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libfortran-toml-0_0.4.3-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libfortran-jonquil-0_0.3.0-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./fortran-fpm_0.12.0-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./patchelf_0.18.0-1.4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./dh-fortran_0.57_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./dh-fortran-mod_0.57_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libscotcherr-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libscotch-64i-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libscotch-64i-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./fonts-mathjax_2.7.9%2bdfsg-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libjs-mathjax_2.7.9%2bdfsg-1_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libcxsparse4_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libfftw3-mpi3_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libbtf2_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libklu2_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libspqr4_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libyaml-0-2_0.2.5-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:42] "GET /./libpetsc-complex3.24_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libopenblas64-pthread-dev_0.3.30%2bds-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libopenblas64-dev_0.3.30%2bds-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libunbound8_1.24.1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libgnutls-dane0t64_3.8.10-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libptscotch-64i-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libpetsc64-complex3.24_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libnghttp3-dev_1.12.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libfftw3-mpi-dev_3.3.10-2%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libbrotli-dev_1.1.0-2%2bb7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libjpeg62-turbo_2.1.5-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libjpeg62-turbo-dev_2.1.5-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./libjpeg-dev_2.1.5-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./librbio4_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:43] "GET /./liblapack-dev_3.12.1-7_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libsuitesparse-mongoose3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libspex3_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libsuitesparse-dev_7.11.0%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libptscotcherr-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libscotch-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libptscotch-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./ocl-icd-opencl-dev_2.3.4-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libhdf5-openmpi-hl-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libhdf5-openmpi-hl-fortran-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libhdf5-openmpi-cpp-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libhdf5-openmpi-hl-cpp-310_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libaec-dev_1.1.4-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libidn2-dev_2.3.8-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libldap-dev_2.6.10%2bdfsg-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libpkgconf3_1.8.1-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./pkgconf-bin_1.8.1-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./pkgconf_1.8.1-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libnghttp2-dev_1.64.0-1.1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libngtcp2-crypto-ossl-dev_1.16.0-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libpsl-dev_0.21.2-1.1%2bb1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libgmpxx4ldbl_6.3.0%2bdfsg-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libgmp-dev_6.3.0%2bdfsg-5_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libtasn1-6-dev_4.20.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./nettle-dev_3.10.2-1_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libgnutls28-dev_3.8.10-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./librtmp-dev_2.4%2b20151223.gitfa8646d.1-3_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libzstd-dev_1.5.7%2bdfsg-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libcurl4-openssl-dev_8.17.0-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:44] "GET /./libhdf5-openmpi-dev_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libhdf5-mpi-dev_1.14.5%2brepack-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libsuperlu-dist-dev_9.2.0%2bdfsg1-4_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libyaml-dev_0.2.5-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libpetsc3.24-dev-common_3.24.1%2bdfsg1-2_all.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libptscotch-64i-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libpetsc64-real3.24_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libscotch-64-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libptscotch-64-7.0_7.0.10-2_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libarpack2t64_3.9.1-6_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libparpack2t64_3.9.1-6_riscv64.deb HTTP/1.1" 200 -
127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libpetsc3.24-dev-examples_3.24.1%2bdfsg1-2_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libscotch-64-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libptscotch-64-dev_7.0.10-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libpetsc-real3.24_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:45] "GET /./libpetsc-complex3.24-dev_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:46] "GET /./libpetsc64-complex3.24-dev_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:46] "GET /./libarpack2-dev_3.9.1-6_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:46] "GET /./libhypre64-dev_2.33.0-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:46] "GET /./libpetsc64-real3.24-dev_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:47] "GET /./dh-python_6.20250414_all.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:47] "GET /./libhypre-dev_2.33.0-3_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:47] "GET /./libpetsc-real3.24-dev_3.24.1%2bdfsg1-2_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:47] "GET /./libparpack2-dev_3.9.1-6_riscv64.deb HTTP/1.1" 200 - 127.0.0.1 - - [05/Jan/2026 18:30:47] "GET /./debootsnap-dummy_1.0_all.deb HTTP/1.1" 200 - I: running --customize-hook directly: /srv/rebuilderd/tmp/tmpc7yjv7bx/apt_install.sh /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re Reading package lists... Building dependency tree... Reading state information... libitm1 is already the newest version (15.2.0-8). libitm1 set to manually installed. libgnutls-openssl27t64 is already the newest version (3.8.10-3). libgnutls-openssl27t64 set to manually installed. libmagic-mgc is already the newest version (1:5.46-5). libmagic-mgc set to manually installed. 
libgfortran-15-dev is already the newest version (15.2.0-8). libgfortran-15-dev set to manually installed. libxcb1 is already the newest version (1.17.0-2+b1). libxcb1 set to manually installed. media-types is already the newest version (14.0.0). media-types set to manually installed. zlib1g-dev is already the newest version (1:1.3.dfsg+really1.3.1-1+b1). zlib1g-dev set to manually installed. xtrans-dev is already the newest version (1.6.0-1). xtrans-dev set to manually installed. libx11-dev is already the newest version (2:1.8.12-1). libx11-dev set to manually installed. libbsd0 is already the newest version (0.12.2-2). libbsd0 set to manually installed. libmumps-dev is already the newest version (5.8.1-2). libmumps-dev set to manually installed. libnuma-dev is already the newest version (2.0.19-1). libnuma-dev set to manually installed. libibverbs-dev is already the newest version (56.1-1+b1). libibverbs-dev set to manually installed. libevent-pthreads-2.1-7t64 is already the newest version (2.1.12-stable-10+b1). libevent-pthreads-2.1-7t64 set to manually installed. libcrypt-dev is already the newest version (1:4.5.1-1). libcrypt-dev set to manually installed. libjansson4 is already the newest version (2.14-2+b3). libjansson4 set to manually installed. sysvinit-utils is already the newest version (3.15-6). perl is already the newest version (5.40.1-7). perl set to manually installed. libxml2-16 is already the newest version (2.15.1+dfsg-0.4). libxml2-16 set to manually installed. libcap2 is already the newest version (1:2.75-10+b1). libmount1 is already the newest version (2.41.2-4). dwz is already the newest version (0.16-2). dwz set to manually installed. libblas-dev is already the newest version (3.12.1-7). libblas-dev set to manually installed. libsasl2-modules-db is already the newest version (2.1.28+dfsg1-10). libsasl2-modules-db set to manually installed. libexpat1 is already the newest version (2.7.3-1). libexpat1 set to manually installed. 
libhogweed6t64 is already the newest version (3.10.2-1). libhogweed6t64 set to manually installed. libevent-openssl-2.1-7t64 is already the newest version (2.1.12-stable-10+b1). libevent-openssl-2.1-7t64 set to manually installed. libtool is already the newest version (2.5.4-7). libtool set to manually installed. libstdc++6 is already the newest version (15.2.0-8). libstdc++6 set to manually installed. gettext is already the newest version (0.23.2-1). gettext set to manually installed. libldl3 is already the newest version (1:7.11.0+dfsg-2). libldl3 set to manually installed. libfftw3-double3 is already the newest version (3.3.10-2+b1). libfftw3-double3 set to manually installed. libtinfo6 is already the newest version (6.5+20251115-2). libelf1t64 is already the newest version (0.194-1). libelf1t64 set to manually installed. libp11-kit-dev is already the newest version (0.25.10-1). libp11-kit-dev set to manually installed. libhypre-2.33.0 is already the newest version (2.33.0-3). libhypre-2.33.0 set to manually installed. libcolamd3 is already the newest version (1:7.11.0+dfsg-2). libcolamd3 set to manually installed. libacl1 is already the newest version (2.3.2-2+b1). libsuperlu-dist9 is already the newest version (9.2.0+dfsg1-4). libsuperlu-dist9 set to manually installed. libsuperlu7 is already the newest version (7.0.1+dfsg1-2). libsuperlu7 set to manually installed. libpam-modules-bin is already the newest version (1.7.0-5). libcbor0.10 is already the newest version (0.10.2-2.1). libcbor0.10 set to manually installed. libzstd1 is already the newest version (1.5.7+dfsg-2). libmd0 is already the newest version (1.1.0-2+b1). libpython3-stdlib is already the newest version (3.13.7-1). libpython3-stdlib set to manually installed. libssl-dev is already the newest version (3.5.4-1). libssl-dev set to manually installed. libfabric1 is already the newest version (2.1.0-1.1). libfabric1 set to manually installed. 
libsuperlu-dev is already the newest version (7.0.1+dfsg1-2). libsuperlu-dev set to manually installed. libasan8 is already the newest version (15.2.0-8). libasan8 set to manually installed. libxext6 is already the newest version (2:1.3.4-1+b3). libxext6 set to manually installed. gcc is already the newest version (4:15.2.0-4). gcc set to manually installed. libfftw3-dev is already the newest version (3.3.10-2+b1). libfftw3-dev set to manually installed. dh-strip-nondeterminism is already the newest version (1.15.0-1). dh-strip-nondeterminism set to manually installed. libxnvctrl0 is already the newest version (535.171.04-1+b2). libxnvctrl0 set to manually installed. libngtcp2-dev is already the newest version (1.16.0-1). libngtcp2-dev set to manually installed. rpcsvc-proto is already the newest version (1.4.3-1+b2). rpcsvc-proto set to manually installed. libpmix2t64 is already the newest version (6.0.0+really5.0.9-2). libpmix2t64 set to manually installed. libfile-libmagic-perl is already the newest version (1.23-2+b2). libfile-libmagic-perl set to manually installed. zlib1g is already the newest version (1:1.3.dfsg+really1.3.1-1+b1). g++-15-riscv64-linux-gnu is already the newest version (15.2.0-8). g++-15-riscv64-linux-gnu set to manually installed. libmumps64-dev is already the newest version (5.8.1-2). libmumps64-dev set to manually installed. libopenblas64-0 is already the newest version (0.3.30+ds-3). libopenblas64-0 set to manually installed. debianutils is already the newest version (5.23.2). build-essential is already the newest version (12.12). build-essential set to manually installed. libamd3 is already the newest version (1:7.11.0+dfsg-2). libamd3 set to manually installed. libmumps-headers-dev is already the newest version (5.8.1-2). libmumps-headers-dev set to manually installed. libperl5.40 is already the newest version (5.40.1-7). libperl5.40 set to manually installed. python3-minimal is already the newest version (3.13.7-1). 
python3-minimal set to manually installed. libidn2-0 is already the newest version (2.3.8-4). libidn2-0 set to manually installed. libhdf5-openmpi-fortran-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-fortran-310 set to manually installed. opencl-clhpp-headers is already the newest version (3.0~2025.07.22-1). opencl-clhpp-headers set to manually installed. libopenblas64-0-pthread is already the newest version (0.3.30+ds-3). libopenblas64-0-pthread set to manually installed. python3 is already the newest version (3.13.7-1). python3 set to manually installed. libbz2-1.0 is already the newest version (1.0.8-6). ncurses-base is already the newest version (6.5+20250216-2). libhypre64-2.33.0 is already the newest version (2.33.0-3). libhypre64-2.33.0 set to manually installed. libmetis5 is already the newest version (5.1.0.dfsg-8). libmetis5 set to manually installed. libatomic1 is already the newest version (15.2.0-8). libatomic1 set to manually installed. libparu1 is already the newest version (1:7.11.0+dfsg-2). libparu1 set to manually installed. gfortran is already the newest version (4:15.2.0-4). gfortran set to manually installed. libnghttp2-14 is already the newest version (1.64.0-1.1+b1). libnghttp2-14 set to manually installed. liblapack64-3 is already the newest version (3.12.1-7). liblapack64-3 set to manually installed. intltool-debian is already the newest version (0.35.0+20060710.6). intltool-debian set to manually installed. libssh2-1-dev is already the newest version (1.11.1-1). libssh2-1-dev set to manually installed. libgnutls30t64 is already the newest version (3.8.10-3). libgnutls30t64 set to manually installed. libx11-6 is already the newest version (2:1.8.12-1). libx11-6 set to manually installed. dpkg-dev is already the newest version (1.22.21). dpkg-dev set to manually installed. binutils is already the newest version (2.45-8). binutils set to manually installed. libx11-data is already the newest version (2:1.8.12-1). 
libx11-data set to manually installed. libkrb5-dev is already the newest version (1.22.1-2). libkrb5-dev set to manually installed. dh-fortran-mod is already the newest version (0.57). dh-fortran-mod set to manually installed. libsuitesparseconfig7 is already the newest version (1:7.11.0+dfsg-2). libsuitesparseconfig7 set to manually installed. libumfpack6 is already the newest version (1:7.11.0+dfsg-2). libumfpack6 set to manually installed. mawk is already the newest version (1.3.4.20250131-1). libgmp10 is already the newest version (2:6.3.0+dfsg-5). libpam-runtime is already the newest version (1.7.0-5). libhwloc-plugins is already the newest version (2.12.2-1). libhwloc-plugins set to manually installed. libscotch-64i-dev is already the newest version (7.0.10-2). libscotch-64i-dev set to manually installed. libjs-mathjax is already the newest version (2.7.9+dfsg-1). libjs-mathjax set to manually installed. libcxsparse4 is already the newest version (1:7.11.0+dfsg-2). libcxsparse4 set to manually installed. libsmartcols1 is already the newest version (2.41.2-4). fonts-mathjax is already the newest version (2.7.9+dfsg-1). fonts-mathjax set to manually installed. libpetsc-complex3.24 is already the newest version (3.24.1+dfsg1-2). libpetsc-complex3.24 set to manually installed. libbtf2 is already the newest version (1:7.11.0+dfsg-2). libbtf2 set to manually installed. libxdmcp6 is already the newest version (1:1.1.5-1). libxdmcp6 set to manually installed. libevent-2.1-7t64 is already the newest version (2.1.12-stable-10+b1). libevent-2.1-7t64 set to manually installed. liblzma5 is already the newest version (5.8.1-2). libopenblas64-dev is already the newest version (0.3.30+ds-3). libopenblas64-dev set to manually installed. passwd is already the newest version (1:4.18.0-2). passwd set to manually installed. python3-click is already the newest version (8.2.0+0.really.8.1.8-1). python3-click set to manually installed. g++ is already the newest version (4:15.2.0-4). 
g++ set to manually installed. libsemanage-common is already the newest version (3.9-1). libsemanage-common set to manually installed. libblas3 is already the newest version (3.12.1-7). libblas3 set to manually installed. libxcb1-dev is already the newest version (1.17.0-2+b1). libxcb1-dev set to manually installed. libgnutls-dane0t64 is already the newest version (3.8.10-3). libgnutls-dane0t64 set to manually installed. adduser is already the newest version (3.153). adduser set to manually installed. m4 is already the newest version (1.4.20-2). m4 set to manually installed. libevent-core-2.1-7t64 is already the newest version (2.1.12-stable-10+b1). libevent-core-2.1-7t64 set to manually installed. libselinux1 is already the newest version (3.9-2). sensible-utils is already the newest version (0.0.26). sensible-utils set to manually installed. autoconf is already the newest version (2.72-3.1). autoconf set to manually installed. libfuse3-4 is already the newest version (3.17.4-1). libfuse3-4 set to manually installed. libpetsc64-complex3.24 is already the newest version (3.24.1+dfsg1-2). libpetsc64-complex3.24 set to manually installed. libnghttp3-dev is already the newest version (1.12.0-1). libnghttp3-dev set to manually installed. libscalapack-mpi-dev is already the newest version (2.2.2-2). libscalapack-mpi-dev set to manually installed. libfftw3-mpi-dev is already the newest version (3.3.10-2+b1). libfftw3-mpi-dev set to manually installed. libkdb5-10t64 is already the newest version (1.22.1-2). libkdb5-10t64 set to manually installed. dpkg is already the newest version (1.22.21). libbrotli-dev is already the newest version (1.1.0-2+b7). libbrotli-dev set to manually installed. libfftw3-long3 is already the newest version (3.3.10-2+b1). libfftw3-long3 set to manually installed. libpam0g is already the newest version (1.7.0-5). cpp is already the newest version (4:15.2.0-4). cpp set to manually installed. libjpeg-dev is already the newest version (1:2.1.5-4). 
libjpeg-dev set to manually installed. librbio4 is already the newest version (1:7.11.0+dfsg-2). librbio4 set to manually installed. python3-magic is already the newest version (2:0.4.27-3). python3-magic set to manually installed. gcc-15-base is already the newest version (15.2.0-8). libp11-kit0 is already the newest version (0.25.10-1). libp11-kit0 set to manually installed. libdb5.3t64 is already the newest version (5.3.28+dfsg2-10). libopenmpi40 is already the newest version (5.0.9-1). libopenmpi40 set to manually installed. diffutils is already the newest version (1:3.12-1). dash is already the newest version (0.5.12-12). libscotcherr-dev is already the newest version (7.0.10-2). libscotcherr-dev set to manually installed. libctf0 is already the newest version (2.45-8). libctf0 set to manually installed. base-files is already the newest version (14). liblapack-dev is already the newest version (3.12.1-7). liblapack-dev set to manually installed. libpetsc3.24-dev-common is already the newest version (3.24.1+dfsg1-2). libpetsc3.24-dev-common set to manually installed. libpkgconf3 is already the newest version (1.8.1-4). libpkgconf3 set to manually installed. libnl-3-200 is already the newest version (3.11.0-2). libnl-3-200 set to manually installed. libibverbs1 is already the newest version (56.1-1+b1). libibverbs1 set to manually installed. util-linux is already the newest version (2.41.2-4). libbinutils is already the newest version (2.45-8). libbinutils set to manually installed. libucx0 is already the newest version (1.19.0+ds-1+b1). libucx0 set to manually installed. libgssrpc4t64 is already the newest version (1.22.1-2). libgssrpc4t64 set to manually installed. libgdbm6t64 is already the newest version (1.26-1). libgdbm6t64 set to manually installed. grep is already the newest version (3.12-1). libuchardet0 is already the newest version (0.0.8-2). libuchardet0 set to manually installed. libdebhelper-perl is already the newest version (13.28). 
libdebhelper-perl set to manually installed. libsystemd0 is already the newest version (259~rc1-1). libsuitesparse-mongoose3 is already the newest version (1:7.11.0+dfsg-2). libsuitesparse-mongoose3 set to manually installed. libcurl4t64 is already the newest version (8.17.0-2). libcurl4t64 set to manually installed. python3.13-minimal is already the newest version (3.13.9-1). python3.13-minimal set to manually installed. binutils-riscv64-linux-gnu is already the newest version (2.45-8). binutils-riscv64-linux-gnu set to manually installed. gcc-15 is already the newest version (15.2.0-8). gcc-15 set to manually installed. libcamd3 is already the newest version (1:7.11.0+dfsg-2). libcamd3 set to manually installed. libhwloc15 is already the newest version (2.12.2-1). libhwloc15 set to manually installed. libffi8 is already the newest version (3.5.2-2). libffi8 set to manually installed. xorg-sgml-doctools is already the newest version (1:1.11-1.1). xorg-sgml-doctools set to manually installed. libhdf5-openmpi-hl-cpp-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-hl-cpp-310 set to manually installed. libpython3.13-stdlib is already the newest version (3.13.9-1). libpython3.13-stdlib set to manually installed. libngtcp2-crypto-ossl0 is already the newest version (1.16.0-1). libngtcp2-crypto-ossl0 set to manually installed. libscalapack-openmpi-dev is already the newest version (2.2.2-2). libscalapack-openmpi-dev set to manually installed. libltdl7 is already the newest version (2.5.4-7). libltdl7 set to manually installed. libcrypt1 is already the newest version (1:4.5.1-1). python3.13 is already the newest version (3.13.9-1). python3.13 set to manually installed. make is already the newest version (4.4.1-3). make set to manually installed. libblkid1 is already the newest version (2.41.2-4). libfile-stripnondeterminism-perl is already the newest version (1.15.0-1). libfile-stripnondeterminism-perl set to manually installed. 
libpython3.13-minimal is already the newest version (3.13.9-1). libpython3.13-minimal set to manually installed. libssl3t64 is already the newest version (3.5.4-1). libhdf5-openmpi-dev is already the newest version (1.14.5+repack-4). libhdf5-openmpi-dev set to manually installed. libxau6 is already the newest version (1:1.0.11-1). libxau6 set to manually installed. libedit2 is already the newest version (3.1-20250104-1). libedit2 set to manually installed. libaudit-common is already the newest version (1:4.1.2-1). libptscotch-64i-dev is already the newest version (7.0.10-2). libptscotch-64i-dev set to manually installed. libldap-dev is already the newest version (2.6.10+dfsg-1). libldap-dev set to manually installed. libkadm5srv-mit12 is already the newest version (1.22.1-2). libkadm5srv-mit12 set to manually installed. ibverbs-providers is already the newest version (56.1-1+b1). ibverbs-providers set to manually installed. libkrb5-3 is already the newest version (1.22.1-2). libkrb5-3 set to manually installed. libmpc3 is already the newest version (1.3.1-2). libmpc3 set to manually installed. libc6-dev is already the newest version (2.41-12). libc6-dev set to manually installed. libnl-3-dev is already the newest version (3.11.0-2). libnl-3-dev set to manually installed. libngtcp2-16 is already the newest version (1.16.0-1). libngtcp2-16 set to manually installed. libptscotch-7.0c is already the newest version (7.0.10-2). libptscotch-7.0c set to manually installed. openmpi-common is already the newest version (5.0.9-1). openmpi-common set to manually installed. libevent-extra-2.1-7t64 is already the newest version (2.1.12-stable-10+b1). libevent-extra-2.1-7t64 set to manually installed. libibmad5 is already the newest version (56.1-1+b1). libibmad5 set to manually installed. libpam-modules is already the newest version (1.7.0-5). libnl-route-3-dev is already the newest version (3.11.0-2). libnl-route-3-dev set to manually installed. 
libpetsc64-real3.24 is already the newest version (3.24.1+dfsg1-2). libpetsc64-real3.24 set to manually installed. libjpeg62-turbo is already the newest version (1:2.1.5-4). libjpeg62-turbo set to manually installed. librtmp1 is already the newest version (2.4+20151223.gitfa8646d.1-3). librtmp1 set to manually installed. libfortran-toml-0 is already the newest version (0.4.3-1). libfortran-toml-0 set to manually installed. libscotch-64i-7.0 is already the newest version (7.0.10-2). libscotch-64i-7.0 set to manually installed. xz-utils is already the newest version (5.8.1-2). xz-utils set to manually installed. libhwloc-dev is already the newest version (2.12.2-1). libhwloc-dev set to manually installed. libptscotch-64-7.0 is already the newest version (7.0.10-2). libptscotch-64-7.0 set to manually installed. libparpack2t64 is already the newest version (3.9.1-6). libparpack2t64 set to manually installed. libkrb5support0 is already the newest version (1.22.1-2). libkrb5support0 set to manually installed. libtasn1-6 is already the newest version (4.20.0-2). libtasn1-6 set to manually installed. krb5-multidev is already the newest version (1.22.1-2). krb5-multidev set to manually installed. liblsan0 is already the newest version (15.2.0-8). liblsan0 set to manually installed. libcap-ng0 is already the newest version (0.8.5-4+b1). sed is already the newest version (4.9-2). libc-bin is already the newest version (2.41-12). libcurl4-openssl-dev is already the newest version (8.17.0-2). libcurl4-openssl-dev set to manually installed. readline-common is already the newest version (8.3-3). readline-common set to manually installed. base-passwd is already the newest version (3.6.8). libcom-err2 is already the newest version (1.47.2-3+b3). libcom-err2 set to manually installed. libmunge2 is already the newest version (0.5.16-1). libmunge2 set to manually installed. ncurses-bin is already the newest version (6.5+20250216-2). pkgconf-bin is already the newest version (1.8.1-4). 
pkgconf-bin set to manually installed. libunistring5 is already the newest version (1.3-2). libunistring5 set to manually installed. librdmacm1t64 is already the newest version (56.1-1+b1). librdmacm1t64 set to manually installed. libuuid1 is already the newest version (2.41.2-4). perl-base is already the newest version (5.40.1-7). libkadm5clnt-mit12 is already the newest version (1.22.1-2). libkadm5clnt-mit12 set to manually installed. libibumad3 is already the newest version (56.1-1+b1). libibumad3 set to manually installed. libmumps-5.8 is already the newest version (5.8.1-2). libmumps-5.8 set to manually installed. libmpfr6 is already the newest version (4.2.2-2). libmpfr6 set to manually installed. libltdl-dev is already the newest version (2.5.4-7). libltdl-dev set to manually installed. man-db is already the newest version (2.13.1-1). man-db set to manually installed. libptscotcherr-dev is already the newest version (7.0.10-2). libptscotcherr-dev set to manually installed. patchelf is already the newest version (0.18.0-1.4). patchelf set to manually installed. libpetsc3.24-dev-examples is already the newest version (3.24.1+dfsg1-2). libpetsc3.24-dev-examples set to manually installed. libkeyutils1 is already the newest version (1.6.3-6). libkeyutils1 set to manually installed. libsz2 is already the newest version (1.1.4-2). libsz2 set to manually installed. libtasn1-6-dev is already the newest version (4.20.0-2). libtasn1-6-dev set to manually installed. libptscotch-64-dev is already the newest version (7.0.10-2). libptscotch-64-dev set to manually installed. libpetsc-real3.24 is already the newest version (3.24.1+dfsg1-2). libpetsc-real3.24 set to manually installed. libspqr4 is already the newest version (1:7.11.0+dfsg-2). libspqr4 set to manually installed. libjs-jquery-ui is already the newest version (1.13.2+dfsg-1). libjs-jquery-ui set to manually installed. mpi-default-bin is already the newest version (1.19). 
mpi-default-bin set to manually installed. cpp-15 is already the newest version (15.2.0-8). cpp-15 set to manually installed. libaec0 is already the newest version (1.1.4-2). libaec0 set to manually installed. gcc-riscv64-linux-gnu is already the newest version (4:15.2.0-4). gcc-riscv64-linux-gnu set to manually installed. libsqlite3-0 is already the newest version (3.46.1-8). libsqlite3-0 set to manually installed. libxau-dev is already the newest version (1:1.0.11-1). libxau-dev set to manually installed. libnghttp2-dev is already the newest version (1.64.0-1.1+b1). libnghttp2-dev set to manually installed. libpetsc-complex3.24-dev is already the newest version (3.24.1+dfsg1-2). libpetsc-complex3.24-dev set to manually installed. libyaml-dev is already the newest version (0.2.5-2). libyaml-dev set to manually installed. dh-fortran is already the newest version (0.57). dh-fortran set to manually installed. libgprofng0 is already the newest version (2.45-8). libgprofng0 set to manually installed. libscalapack-openmpi2.2 is already the newest version (2.2.2-2). libscalapack-openmpi2.2 set to manually installed. libscotcherr-7.0 is already the newest version (7.0.10-2). libscotcherr-7.0 set to manually installed. libarpack2t64 is already the newest version (3.9.1-6). libarpack2t64 set to manually installed. init-system-helpers is already the newest version (1.69). libzstd-dev is already the newest version (1.5.7+dfsg-2). libzstd-dev set to manually installed. libgmpxx4ldbl is already the newest version (2:6.3.0+dfsg-5). libgmpxx4ldbl set to manually installed. libevent-dev is already the newest version (2.1.12-stable-10+b1). libevent-dev set to manually installed. automake is already the newest version (1:1.18.1-3). automake set to manually installed. libpipeline1 is already the newest version (1.5.8-1). libpipeline1 set to manually installed. libptscotch-dev is already the newest version (7.0.10-2). libptscotch-dev set to manually installed. 
libblas64-3 is already the newest version (3.12.1-7). libblas64-3 set to manually installed. libaudit1 is already the newest version (1:4.1.2-1). gettext-base is already the newest version (0.23.2-1). gettext-base set to manually installed. libpetsc64-complex3.24-dev is already the newest version (3.24.1+dfsg1-2). libpetsc64-complex3.24-dev set to manually installed. po-debconf is already the newest version (1.0.21+nmu1). po-debconf set to manually installed. libstdc++-15-dev is already the newest version (15.2.0-8). libstdc++-15-dev set to manually installed. perl-modules-5.40 is already the newest version (5.40.1-7). perl-modules-5.40 set to manually installed. ocl-icd-libopencl1 is already the newest version (2.3.4-1). ocl-icd-libopencl1 set to manually installed. opencl-c-headers is already the newest version (3.0~2025.07.22-2). opencl-c-headers set to manually installed. libspex3 is already the newest version (1:7.11.0+dfsg-2). libspex3 set to manually installed. libcc1-0 is already the newest version (15.2.0-8). libcc1-0 set to manually installed. libfortran-jonquil-0 is already the newest version (0.3.0-3). libfortran-jonquil-0 set to manually installed. pkgconf is already the newest version (1.8.1-4). pkgconf set to manually installed. libsframe2 is already the newest version (2.45-8). libsframe2 set to manually installed. bsdextrautils is already the newest version (2.41.2-4). bsdextrautils set to manually installed. linux-libc-dev is already the newest version (6.17.8-1). linux-libc-dev set to manually installed. openmpi-bin is already the newest version (5.0.9-1). openmpi-bin set to manually installed. libreadline8t64 is already the newest version (8.3-3). libreadline8t64 set to manually installed. libdpkg-perl is already the newest version (1.22.21). libdpkg-perl set to manually installed. tar is already the newest version (1.35+dfsg-3.1). libarpack2-dev is already the newest version (3.9.1-6). libarpack2-dev set to manually installed. 
libisl23 is already the newest version (0.27-1). libisl23 set to manually installed. libattr1 is already the newest version (1:2.5.2-3). libpetsc64-real3.24-dev is already the newest version (3.24.1+dfsg1-2). libpetsc64-real3.24-dev set to manually installed. libldap2 is already the newest version (2.6.10+dfsg-1). libldap2 set to manually installed. libsasl2-2 is already the newest version (2.1.28+dfsg1-10). libsasl2-2 set to manually installed. libsemanage2 is already the newest version (3.9-1). libsemanage2 set to manually installed. binutils-common is already the newest version (2.45-8). binutils-common set to manually installed. libubsan1 is already the newest version (15.2.0-8). libubsan1 set to manually installed. libxdmcp-dev is already the newest version (1:1.1.5-1). libxdmcp-dev set to manually installed. libscotch-64-dev is already the newest version (7.0.10-2). libscotch-64-dev set to manually installed. libctf-nobfd0 is already the newest version (2.45-8). libctf-nobfd0 set to manually installed. autopoint is already the newest version (0.23.2-1). autopoint set to manually installed. mpi-default-dev is already the newest version (1.19). mpi-default-dev set to manually installed. libbrotli1 is already the newest version (1.1.0-2+b7). libbrotli1 set to manually installed. libopenblas64-pthread-dev is already the newest version (0.3.30+ds-3). libopenblas64-pthread-dev set to manually installed. libscotch-7.0c is already the newest version (7.0.10-2). libscotch-7.0c set to manually installed. libklu2 is already the newest version (1:7.11.0+dfsg-2). libklu2 set to manually installed. libhdf5-openmpi-hl-fortran-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-hl-fortran-310 set to manually installed. libhdf5-openmpi-hl-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-hl-310 set to manually installed. dh-python is already the newest version (6.20250414). dh-python set to manually installed. 
g++-riscv64-linux-gnu is already the newest version (4:15.2.0-4). g++-riscv64-linux-gnu set to manually installed. libgcc-s1 is already the newest version (15.2.0-8). liblapack3 is already the newest version (3.12.1-7). liblapack3 set to manually installed. gfortran-riscv64-linux-gnu is already the newest version (4:15.2.0-4). gfortran-riscv64-linux-gnu set to manually installed. libcholmod5 is already the newest version (1:7.11.0+dfsg-2). libcholmod5 set to manually installed. libfido2-1 is already the newest version (1.16.0-2). libfido2-1 set to manually installed. fortran-fpm is already the newest version (0.12.0-5). fortran-fpm set to manually installed. libopenmpi-dev is already the newest version (5.0.9-1). libopenmpi-dev set to manually installed. gzip is already the newest version (1.13-1). libgfortran5 is already the newest version (15.2.0-8). libgfortran5 set to manually installed. libscotch-64-7.0 is already the newest version (7.0.10-2). libscotch-64-7.0 set to manually installed. libgssapi-krb5-2 is already the newest version (1.22.1-2). libgssapi-krb5-2 set to manually installed. libc6 is already the newest version (2.41-12). libgdbm-compat4t64 is already the newest version (1.26-1). libgdbm-compat4t64 set to manually installed. libhdf5-mpi-dev is already the newest version (1.14.5+repack-4). libhdf5-mpi-dev set to manually installed. libgnutls28-dev is already the newest version (3.8.10-3). libgnutls28-dev set to manually installed. autotools-dev is already the newest version (20240727.1). autotools-dev set to manually installed. libptscotch-64i-7.0 is already the newest version (7.0.10-2). libptscotch-64i-7.0 set to manually installed. libsepol2 is already the newest version (3.9-2). libsepol2 set to manually installed. groff-base is already the newest version (1.23.0-9). groff-base set to manually installed. libcombblas2.0.0t64 is already the newest version (2.0.0-7). libcombblas2.0.0t64 set to manually installed. 
libncursesw6 is already the newest version (6.5+20251115-2). libncursesw6 set to manually installed. dh-autoreconf is already the newest version (21). dh-autoreconf set to manually installed. tzdata is already the newest version (2025b-5). tzdata set to manually installed. libmumps-64pord-5.8 is already the newest version (5.8.1-2). libmumps-64pord-5.8 set to manually installed. libarchive-zip-perl is already the newest version (1.68-1). libarchive-zip-perl set to manually installed. gfortran-15 is already the newest version (15.2.0-8). gfortran-15 set to manually installed. libhypre64-dev is already the newest version (2.33.0-3). libhypre64-dev set to manually installed. libpsl5t64 is already the newest version (0.21.2-1.1+b1). libpsl5t64 set to manually installed. libsuperlu-dist-dev is already the newest version (9.2.0+dfsg1-4). libsuperlu-dist-dev set to manually installed. libdebconfclient0 is already the newest version (0.281). libpetsc-real3.24-dev is already the newest version (3.24.1+dfsg1-2). libpetsc-real3.24-dev set to manually installed. libpsl-dev is already the newest version (0.21.2-1.1+b1). libpsl-dev set to manually installed. patch is already the newest version (2.8-2). patch set to manually installed. netbase is already the newest version (6.5). netbase set to manually installed. nettle-dev is already the newest version (3.10.2-1). nettle-dev set to manually installed. findutils is already the newest version (4.10.0-3). cpp-15-riscv64-linux-gnu is already the newest version (15.2.0-8). cpp-15-riscv64-linux-gnu set to manually installed. debhelper is already the newest version (13.28). debhelper set to manually installed. bzip2 is already the newest version (1.0.8-6). bzip2 set to manually installed. chrpath is already the newest version (0.18-1). chrpath set to manually installed. gfortran-15-riscv64-linux-gnu is already the newest version (15.2.0-8). gfortran-15-riscv64-linux-gnu set to manually installed. 
libnuma1 is already the newest version (2.0.19-1). libnuma1 set to manually installed. libnettle8t64 is already the newest version (3.10.2-1). libnettle8t64 set to manually installed. libtsan2 is already the newest version (15.2.0-8). libtsan2 set to manually installed. libparpack2-dev is already the newest version (3.9.1-6). libparpack2-dev set to manually installed. file is already the newest version (1:5.46-5). file set to manually installed. libptscotcherr-7.0 is already the newest version (7.0.10-2). libptscotcherr-7.0 set to manually installed. comerr-dev is already the newest version (2.1-1.47.2-3+b3). comerr-dev set to manually installed. libsuitesparse-dev is already the newest version (1:7.11.0+dfsg-2). libsuitesparse-dev set to manually installed. libgcc-15-dev is already the newest version (15.2.0-8). libgcc-15-dev set to manually installed. libccolamd3 is already the newest version (1:7.11.0+dfsg-2). libccolamd3 set to manually installed. libjpeg62-turbo-dev is already the newest version (1:2.1.5-4). libjpeg62-turbo-dev set to manually installed. librtmp-dev is already the newest version (2.4+20151223.gitfa8646d.1-3). librtmp-dev set to manually installed. bash is already the newest version (5.3-1). libmagic1t64 is already the newest version (1:5.46-5). libmagic1t64 set to manually installed. libngtcp2-crypto-ossl-dev is already the newest version (1.16.0-1). libngtcp2-crypto-ossl-dev set to manually installed. coreutils is already the newest version (9.7-3). libaec-dev is already the newest version (1.1.4-2). libaec-dev set to manually installed. libunbound8 is already the newest version (1.24.1-2). libunbound8 set to manually installed. g++-15 is already the newest version (15.2.0-8). g++-15 set to manually installed. libfftw3-single3 is already the newest version (3.3.10-2+b1). libfftw3-single3 set to manually installed. libhdf5-openmpi-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-310 set to manually installed. 
openssl-provider-legacy is already the newest version (3.5.4-1). gcc-15-riscv64-linux-gnu is already the newest version (15.2.0-8). gcc-15-riscv64-linux-gnu set to manually installed. libhypre-dev is already the newest version (2.33.0-3). libhypre-dev set to manually installed. ocl-icd-opencl-dev is already the newest version (2.3.4-1). ocl-icd-opencl-dev set to manually installed. libfftw3-mpi3 is already the newest version (3.3.10-2+b1). libfftw3-mpi3 set to manually installed. libhdf5-openmpi-cpp-310 is already the newest version (1.14.5+repack-4). libhdf5-openmpi-cpp-310 set to manually installed. libpciaccess0 is already the newest version (0.17-3+b3). libpciaccess0 set to manually installed. libpcre2-8-0 is already the newest version (10.46-1). libgomp1 is already the newest version (15.2.0-8). libgomp1 set to manually installed. libgmp-dev is already the newest version (2:6.3.0+dfsg-5). libgmp-dev set to manually installed. libjs-jquery is already the newest version (3.7.1+dfsg+~3.5.33-1). libjs-jquery set to manually installed. libnl-route-3-200 is already the newest version (3.11.0-2). libnl-route-3-200 set to manually installed. libc-dev-bin is already the newest version (2.41-12). libc-dev-bin set to manually installed. openssh-client is already the newest version (1:10.2p1-2). openssh-client set to manually installed. libscotch-dev is already the newest version (7.0.10-2). libscotch-dev set to manually installed. libyaml-0-2 is already the newest version (0.2.5-2). libyaml-0-2 set to manually installed. libnghttp3-9 is already the newest version (1.12.0-1). libnghttp3-9 set to manually installed. libssh2-1t64 is already the newest version (1.11.1-1). libssh2-1t64 set to manually installed. debconf is already the newest version (1.5.91). cpp-riscv64-linux-gnu is already the newest version (4:15.2.0-4). cpp-riscv64-linux-gnu set to manually installed. libudev1 is already the newest version (259~rc1-1). hostname is already the newest version (3.25). 
libk5crypto3 is already the newest version (1.22.1-2). libk5crypto3 set to manually installed. libidn2-dev is already the newest version (2.3.8-4). libidn2-dev set to manually installed. x11proto-dev is already the newest version (2024.1-1). x11proto-dev set to manually installed. login.defs is already the newest version (1:4.18.0-2). login.defs set to manually installed. libfftw3-bin is already the newest version (3.3.10-2+b1). libfftw3-bin set to manually installed. 0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. I: running --customize-hook in shell: sh -c 'chroot "$1" dpkg -r debootsnap-dummy' exec /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re (Reading database ... 34789 files and directories currently installed.) Removing debootsnap-dummy (1.0) ... I: running --customize-hook in shell: sh -c 'chroot "$1" dpkg-query --showformat '${binary:Package}=${Version}\n' --show > "$1/pkglist"' exec /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re I: running special hook: download /pkglist ./pkglist I: running --customize-hook in shell: sh -c 'rm "$1/pkglist"' exec /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re I: running special hook: upload sources.list /etc/apt/sources.list I: waiting for background processes to finish... I: cleaning package lists and apt cache... I: skipping cleanup/reproducible as requested I: creating tarball... I: done I: removing tempdir /srv/rebuilderd/tmp/mmdebstrap.QrxY6BU_Re... 
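The customize-hooks above snapshot the chroot's package set with `dpkg-query --showformat '${binary:Package}=${Version}\n'` and download the result as `pkglist`, which is the list later compared against the `.buildinfo`. A stand-in sketch of that `Package=Version` formatting (sample data only, since `dpkg-query` needs a real Debian chroot):

```shell
# Format sample package data the way the hook's dpkg-query call does
# (${binary:Package}=${Version}); the two packages listed are illustrative.
pkglist=$(while read -r pkg ver; do
  printf '%s=%s\n' "$pkg" "$ver"
done <<'EOF'
bash 5.3-1
coreutils 9.7-3
EOF
)
echo "$pkglist"
```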
I: success in 190.2900 seconds Downloading dependency 370 of 393: ocl-icd-opencl-dev:riscv64=2.3.4-1 Downloading dependency 371 of 393: libfftw3-mpi3:riscv64=3.3.10-2+b1 Downloading dependency 372 of 393: libhdf5-openmpi-cpp-310:riscv64=1.14.5+repack-4 Downloading dependency 373 of 393: libpciaccess0:riscv64=0.17-3+b3 Downloading dependency 374 of 393: libpcre2-8-0:riscv64=10.46-1 Downloading dependency 375 of 393: libgomp1:riscv64=15.2.0-8 Downloading dependency 376 of 393: libgmp-dev:riscv64=2:6.3.0+dfsg-5 Downloading dependency 377 of 393: libjs-jquery:riscv64=3.7.1+dfsg+~3.5.33-1 Downloading dependency 378 of 393: libnl-route-3-200:riscv64=3.11.0-2 Downloading dependency 379 of 393: libc-dev-bin:riscv64=2.41-12 Downloading dependency 380 of 393: openssh-client:riscv64=1:10.2p1-2 Downloading dependency 381 of 393: libscotch-dev:riscv64=7.0.10-2 Downloading dependency 382 of 393: libyaml-0-2:riscv64=0.2.5-2 Downloading dependency 383 of 393: libnghttp3-9:riscv64=1.12.0-1 Downloading dependency 384 of 393: libssh2-1t64:riscv64=1.11.1-1 Downloading dependency 385 of 393: debconf:riscv64=1.5.91 Downloading dependency 386 of 393: cpp-riscv64-linux-gnu:riscv64=4:15.2.0-4 Downloading dependency 387 of 393: libudev1:riscv64=259~rc1-1 Downloading dependency 388 of 393: hostname:riscv64=3.25 Downloading dependency 389 of 393: libk5crypto3:riscv64=1.22.1-2 Downloading dependency 390 of 393: libidn2-dev:riscv64=2.3.8-4 Downloading dependency 391 of 393: x11proto-dev:riscv64=2024.1-1 Downloading dependency 392 of 393: login.defs:riscv64=1:4.18.0-2 Downloading dependency 393 of 393: libfftw3-bin:riscv64=3.3.10-2+b1 env --chdir=/srv/rebuilderd/tmp/rebuilderd1Mmvtg/out DEB_BUILD_OPTIONS=parallel=4 LANG=C.UTF-8 LC_COLLATE=C.UTF-8 LC_CTYPE=C.UTF-8 SOURCE_DATE_EPOCH=1763813105 SBUILD_CONFIG=/srv/rebuilderd/tmp/debrebuildvMXr_h/debrebuild.sbuildrc.6p0RlS1J1u_g sbuild --build=riscv64 --host=riscv64 --no-source --arch-any --no-arch-all 
--chroot=/srv/rebuilderd/tmp/debrebuildvMXr_h/debrebuild.tar.7WNIdvxHBf6v --chroot-mode=unshare --dist=unstable --no-run-lintian --no-run-piuparts --no-run-autopkgtest --no-apt-update --no-apt-upgrade --no-apt-distupgrade --verbose --nolog --bd-uninstallable-explainer= --build-path=/build/reproducible-path --dsc-dir=slepc-3.24.1+dfsg1 /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1.dsc I: consider moving your ~/.sbuildrc to /srv/rebuilderd/.config/sbuild/config.pl The Debian buildds switched to the "unshare" backend and sbuild will default to it in the future. To start using "unshare" add this to your `~/.config/sbuild/config.pl`: $chroot_mode = "unshare"; If you want to keep the old "schroot" mode even in the future, add the following to your `~/.config/sbuild/config.pl`: $chroot_mode = "schroot"; $schroot = "schroot"; sbuild (Debian sbuild) 0.89.3+deb13u1 (16 August 2025) on localhost +==============================================================================+ | slepc 3.24.1+dfsg1-1 (riscv64) Mon, 05 Jan 2026 10:33:30 +0000 | +==============================================================================+ Package: slepc Version: 3.24.1+dfsg1-1 Source Version: 3.24.1+dfsg1-1 Distribution: unstable Machine Architecture: riscv64 Host Architecture: riscv64 Build Architecture: riscv64 Build Type: any I: No tarballs found in /srv/rebuilderd/.cache/sbuild I: Unpacking /srv/rebuilderd/tmp/debrebuildvMXr_h/debrebuild.tar.7WNIdvxHBf6v to /srv/rebuilderd/tmp/tmp.sbuild.0QlxZp6U1n... I: Setting up the chroot... I: Creating chroot session... I: Setting up log color... I: Setting up apt archive... 
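sbuild's notice above recommends pinning the chroot backend in `~/.config/sbuild/config.pl`. A minimal way to apply that suggestion, using a temporary directory as a stand-in for the real config path so the sketch touches no actual home directory:

```shell
# Write the backend setting sbuild suggests; confdir stands in for
# ~/.config/sbuild in this sketch.
confdir=$(mktemp -d)
cat > "$confdir/config.pl" <<'EOF'
$chroot_mode = "unshare";
EOF
grep -q 'chroot_mode = "unshare";' "$confdir/config.pl" && echo "backend pinned"
rm -r "$confdir"
```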
+------------------------------------------------------------------------------+ | Fetch source files Mon, 05 Jan 2026 10:34:01 +0000 | +------------------------------------------------------------------------------+ Local sources ------------- /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1.dsc exists in /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs; copying to chroot +------------------------------------------------------------------------------+ | Install package build dependencies Mon, 05 Jan 2026 10:34:06 +0000 | +------------------------------------------------------------------------------+ Setup apt archive ----------------- Merged Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran-mod, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev, build-essential Filtered Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran-mod, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev, build-essential dpkg-deb: building package 'sbuild-build-depends-main-dummy' in '/build/reproducible-path/resolver-in492I/apt_archive/sbuild-build-depends-main-dummy.deb'. 
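To resolve the merged Build-Depends, sbuild generates the `sbuild-build-depends-main-dummy` package whose `Depends:` field carries the dependency list, so apt's solver does the installation. A rough sketch of that control stanza; the directory layout, version string, and the shortened dependency list are illustrative, not sbuild's actual implementation:

```shell
# Build a dummy control file whose Depends mirror the merged Build-Depends
# (truncated here); `dpkg-deb --build` would turn it into the .deb apt installs.
mkdir -p dummy/DEBIAN
cat > dummy/DEBIAN/control <<'EOF'
Package: sbuild-build-depends-main-dummy
Version: 0.invalid.0
Architecture: riscv64
Maintainer: sbuild
Description: carries the build dependencies for apt to resolve
Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, gfortran
EOF
# dpkg-deb --build dummy   # (needs dpkg; produces the .deb seen in the log above)
grep -c '^Depends:' dummy/DEBIAN/control
rm -r dummy
```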
Install main build dependencies (apt-based resolver) ---------------------------------------------------- Installing build dependencies +------------------------------------------------------------------------------+ | Check architectures Mon, 05 Jan 2026 10:34:15 +0000 | +------------------------------------------------------------------------------+ Arch check ok (riscv64 included in any all) +------------------------------------------------------------------------------+ | Build environment Mon, 05 Jan 2026 10:34:16 +0000 | +------------------------------------------------------------------------------+ Kernel: Linux 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 (riscv64) Toolchain package versions: binutils_2.45-8 dpkg-dev_1.22.21 g++-15_15.2.0-8 gcc-15_15.2.0-8 libc6-dev_2.41-12 libstdc++-15-dev_15.2.0-8 libstdc++6_15.2.0-8 linux-libc-dev_6.17.8-1 Package versions: adduser_3.153 autoconf_2.72-3.1 automake_1:1.18.1-3 autopoint_0.23.2-1 autotools-dev_20240727.1 base-files_14 base-passwd_3.6.8 bash_5.3-1 binutils_2.45-8 binutils-common_2.45-8 binutils-riscv64-linux-gnu_2.45-8 bsdextrautils_2.41.2-4 build-essential_12.12 bzip2_1.0.8-6 chrpath_0.18-1 comerr-dev_2.1-1.47.2-3+b3 coreutils_9.7-3 cpp_4:15.2.0-4 cpp-15_15.2.0-8 cpp-15-riscv64-linux-gnu_15.2.0-8 cpp-riscv64-linux-gnu_4:15.2.0-4 dash_0.5.12-12 debconf_1.5.91 debhelper_13.28 debianutils_5.23.2 dh-autoreconf_21 dh-fortran_0.57 dh-fortran-mod_0.57 dh-python_6.20250414 dh-strip-nondeterminism_1.15.0-1 diffutils_1:3.12-1 dpkg_1.22.21 dpkg-dev_1.22.21 dwz_0.16-2 file_1:5.46-5 findutils_4.10.0-3 fonts-mathjax_2.7.9+dfsg-1 fortran-fpm_0.12.0-5 g++_4:15.2.0-4 g++-15_15.2.0-8 g++-15-riscv64-linux-gnu_15.2.0-8 g++-riscv64-linux-gnu_4:15.2.0-4 gcc_4:15.2.0-4 gcc-15_15.2.0-8 gcc-15-base_15.2.0-8 gcc-15-riscv64-linux-gnu_15.2.0-8 gcc-riscv64-linux-gnu_4:15.2.0-4 gettext_0.23.2-1 gettext-base_0.23.2-1 gfortran_4:15.2.0-4 gfortran-15_15.2.0-8 gfortran-15-riscv64-linux-gnu_15.2.0-8 
gfortran-riscv64-linux-gnu_4:15.2.0-4 grep_3.12-1 groff-base_1.23.0-9 gzip_1.13-1 hostname_3.25 ibverbs-providers_56.1-1+b1 init-system-helpers_1.69 intltool-debian_0.35.0+20060710.6 krb5-multidev_1.22.1-2 libacl1_2.3.2-2+b1 libaec-dev_1.1.4-2 libaec0_1.1.4-2 libamd3_1:7.11.0+dfsg-2 libarchive-zip-perl_1.68-1 libarpack2-dev_3.9.1-6 libarpack2t64_3.9.1-6 libasan8_15.2.0-8 libatomic1_15.2.0-8 libattr1_1:2.5.2-3 libaudit-common_1:4.1.2-1 libaudit1_1:4.1.2-1 libbinutils_2.45-8 libblas-dev_3.12.1-7 libblas3_3.12.1-7 libblas64-3_3.12.1-7 libblkid1_2.41.2-4 libbrotli-dev_1.1.0-2+b7 libbrotli1_1.1.0-2+b7 libbsd0_0.12.2-2 libbtf2_1:7.11.0+dfsg-2 libbz2-1.0_1.0.8-6 libc-bin_2.41-12 libc-dev-bin_2.41-12 libc6_2.41-12 libc6-dev_2.41-12 libcamd3_1:7.11.0+dfsg-2 libcap-ng0_0.8.5-4+b1 libcap2_1:2.75-10+b1 libcbor0.10_0.10.2-2.1 libcc1-0_15.2.0-8 libccolamd3_1:7.11.0+dfsg-2 libcholmod5_1:7.11.0+dfsg-2 libcolamd3_1:7.11.0+dfsg-2 libcom-err2_1.47.2-3+b3 libcombblas2.0.0t64_2.0.0-7 libcrypt-dev_1:4.5.1-1 libcrypt1_1:4.5.1-1 libctf-nobfd0_2.45-8 libctf0_2.45-8 libcurl4-openssl-dev_8.17.0-2 libcurl4t64_8.17.0-2 libcxsparse4_1:7.11.0+dfsg-2 libdb5.3t64_5.3.28+dfsg2-10 libdebconfclient0_0.281 libdebhelper-perl_13.28 libdpkg-perl_1.22.21 libedit2_3.1-20250104-1 libelf1t64_0.194-1 libevent-2.1-7t64_2.1.12-stable-10+b1 libevent-core-2.1-7t64_2.1.12-stable-10+b1 libevent-dev_2.1.12-stable-10+b1 libevent-extra-2.1-7t64_2.1.12-stable-10+b1 libevent-openssl-2.1-7t64_2.1.12-stable-10+b1 libevent-pthreads-2.1-7t64_2.1.12-stable-10+b1 libexpat1_2.7.3-1 libfabric1_2.1.0-1.1 libffi8_3.5.2-2 libfftw3-bin_3.3.10-2+b1 libfftw3-dev_3.3.10-2+b1 libfftw3-double3_3.3.10-2+b1 libfftw3-long3_3.3.10-2+b1 libfftw3-mpi-dev_3.3.10-2+b1 libfftw3-mpi3_3.3.10-2+b1 libfftw3-single3_3.3.10-2+b1 libfido2-1_1.16.0-2 libfile-libmagic-perl_1.23-2+b2 libfile-stripnondeterminism-perl_1.15.0-1 libfortran-jonquil-0_0.3.0-3 libfortran-toml-0_0.4.3-1 libfuse3-4_3.17.4-1 libgcc-15-dev_15.2.0-8 libgcc-s1_15.2.0-8 
libgdbm-compat4t64_1.26-1 libgdbm6t64_1.26-1 libgfortran-15-dev_15.2.0-8 libgfortran5_15.2.0-8 libgmp-dev_2:6.3.0+dfsg-5 libgmp10_2:6.3.0+dfsg-5 libgmpxx4ldbl_2:6.3.0+dfsg-5 libgnutls-dane0t64_3.8.10-3 libgnutls-openssl27t64_3.8.10-3 libgnutls28-dev_3.8.10-3 libgnutls30t64_3.8.10-3 libgomp1_15.2.0-8 libgprofng0_2.45-8 libgssapi-krb5-2_1.22.1-2 libgssrpc4t64_1.22.1-2 libhdf5-mpi-dev_1.14.5+repack-4 libhdf5-openmpi-310_1.14.5+repack-4 libhdf5-openmpi-cpp-310_1.14.5+repack-4 libhdf5-openmpi-dev_1.14.5+repack-4 libhdf5-openmpi-fortran-310_1.14.5+repack-4 libhdf5-openmpi-hl-310_1.14.5+repack-4 libhdf5-openmpi-hl-cpp-310_1.14.5+repack-4 libhdf5-openmpi-hl-fortran-310_1.14.5+repack-4 libhogweed6t64_3.10.2-1 libhwloc-dev_2.12.2-1 libhwloc-plugins_2.12.2-1 libhwloc15_2.12.2-1 libhypre-2.33.0_2.33.0-3 libhypre-dev_2.33.0-3 libhypre64-2.33.0_2.33.0-3 libhypre64-dev_2.33.0-3 libibmad5_56.1-1+b1 libibumad3_56.1-1+b1 libibverbs-dev_56.1-1+b1 libibverbs1_56.1-1+b1 libidn2-0_2.3.8-4 libidn2-dev_2.3.8-4 libisl23_0.27-1 libitm1_15.2.0-8 libjansson4_2.14-2+b3 libjpeg-dev_1:2.1.5-4 libjpeg62-turbo_1:2.1.5-4 libjpeg62-turbo-dev_1:2.1.5-4 libjs-jquery_3.7.1+dfsg+~3.5.33-1 libjs-jquery-ui_1.13.2+dfsg-1 libjs-mathjax_2.7.9+dfsg-1 libk5crypto3_1.22.1-2 libkadm5clnt-mit12_1.22.1-2 libkadm5srv-mit12_1.22.1-2 libkdb5-10t64_1.22.1-2 libkeyutils1_1.6.3-6 libklu2_1:7.11.0+dfsg-2 libkrb5-3_1.22.1-2 libkrb5-dev_1.22.1-2 libkrb5support0_1.22.1-2 liblapack-dev_3.12.1-7 liblapack3_3.12.1-7 liblapack64-3_3.12.1-7 libldap-dev_2.6.10+dfsg-1 libldap2_2.6.10+dfsg-1 libldl3_1:7.11.0+dfsg-2 liblsan0_15.2.0-8 libltdl-dev_2.5.4-7 libltdl7_2.5.4-7 liblzma5_5.8.1-2 libmagic-mgc_1:5.46-5 libmagic1t64_1:5.46-5 libmd0_1.1.0-2+b1 libmetis5_5.1.0.dfsg-8 libmount1_2.41.2-4 libmpc3_1.3.1-2 libmpfr6_4.2.2-2 libmumps-5.8_5.8.1-2 libmumps-64pord-5.8_5.8.1-2 libmumps-dev_5.8.1-2 libmumps-headers-dev_5.8.1-2 libmumps64-dev_5.8.1-2 libmunge2_0.5.16-1 libncursesw6_6.5+20251115-2 libnettle8t64_3.10.2-1 
libnghttp2-14_1.64.0-1.1+b1 libnghttp2-dev_1.64.0-1.1+b1 libnghttp3-9_1.12.0-1 libnghttp3-dev_1.12.0-1 libngtcp2-16_1.16.0-1 libngtcp2-crypto-ossl-dev_1.16.0-1 libngtcp2-crypto-ossl0_1.16.0-1 libngtcp2-dev_1.16.0-1 libnl-3-200_3.11.0-2 libnl-3-dev_3.11.0-2 libnl-route-3-200_3.11.0-2 libnl-route-3-dev_3.11.0-2 libnuma-dev_2.0.19-1 libnuma1_2.0.19-1 libopenblas64-0_0.3.30+ds-3 libopenblas64-0-pthread_0.3.30+ds-3 libopenblas64-dev_0.3.30+ds-3 libopenblas64-pthread-dev_0.3.30+ds-3 libopenmpi-dev_5.0.9-1 libopenmpi40_5.0.9-1 libp11-kit-dev_0.25.10-1 libp11-kit0_0.25.10-1 libpam-modules_1.7.0-5 libpam-modules-bin_1.7.0-5 libpam-runtime_1.7.0-5 libpam0g_1.7.0-5 libparpack2-dev_3.9.1-6 libparpack2t64_3.9.1-6 libparu1_1:7.11.0+dfsg-2 libpciaccess0_0.17-3+b3 libpcre2-8-0_10.46-1 libperl5.40_5.40.1-7 libpetsc-complex3.24_3.24.1+dfsg1-2 libpetsc-complex3.24-dev_3.24.1+dfsg1-2 libpetsc-real3.24_3.24.1+dfsg1-2 libpetsc-real3.24-dev_3.24.1+dfsg1-2 libpetsc3.24-dev-common_3.24.1+dfsg1-2 libpetsc3.24-dev-examples_3.24.1+dfsg1-2 libpetsc64-complex3.24_3.24.1+dfsg1-2 libpetsc64-complex3.24-dev_3.24.1+dfsg1-2 libpetsc64-real3.24_3.24.1+dfsg1-2 libpetsc64-real3.24-dev_3.24.1+dfsg1-2 libpipeline1_1.5.8-1 libpkgconf3_1.8.1-4 libpmix2t64_6.0.0+really5.0.9-2 libpsl-dev_0.21.2-1.1+b1 libpsl5t64_0.21.2-1.1+b1 libptscotch-64-7.0_7.0.10-2 libptscotch-64-dev_7.0.10-2 libptscotch-64i-7.0_7.0.10-2 libptscotch-64i-dev_7.0.10-2 libptscotch-7.0c_7.0.10-2 libptscotch-dev_7.0.10-2 libptscotcherr-7.0_7.0.10-2 libptscotcherr-dev_7.0.10-2 libpython3-stdlib_3.13.7-1 libpython3.13-minimal_3.13.9-1 libpython3.13-stdlib_3.13.9-1 librbio4_1:7.11.0+dfsg-2 librdmacm1t64_56.1-1+b1 libreadline8t64_8.3-3 librtmp-dev_2.4+20151223.gitfa8646d.1-3 librtmp1_2.4+20151223.gitfa8646d.1-3 libsasl2-2_2.1.28+dfsg1-10 libsasl2-modules-db_2.1.28+dfsg1-10 libscalapack-mpi-dev_2.2.2-2 libscalapack-openmpi-dev_2.2.2-2 libscalapack-openmpi2.2_2.2.2-2 libscotch-64-7.0_7.0.10-2 libscotch-64-dev_7.0.10-2 libscotch-64i-7.0_7.0.10-2 
libscotch-64i-dev_7.0.10-2 libscotch-7.0c_7.0.10-2 libscotch-dev_7.0.10-2 libscotcherr-7.0_7.0.10-2 libscotcherr-dev_7.0.10-2 libselinux1_3.9-2 libsemanage-common_3.9-1 libsemanage2_3.9-1 libsepol2_3.9-2 libsframe2_2.45-8 libsmartcols1_2.41.2-4 libspex3_1:7.11.0+dfsg-2 libspqr4_1:7.11.0+dfsg-2 libsqlite3-0_3.46.1-8 libssh2-1-dev_1.11.1-1 libssh2-1t64_1.11.1-1 libssl-dev_3.5.4-1 libssl3t64_3.5.4-1 libstdc++-15-dev_15.2.0-8 libstdc++6_15.2.0-8 libsuitesparse-dev_1:7.11.0+dfsg-2 libsuitesparse-mongoose3_1:7.11.0+dfsg-2 libsuitesparseconfig7_1:7.11.0+dfsg-2 libsuperlu-dev_7.0.1+dfsg1-2 libsuperlu-dist-dev_9.2.0+dfsg1-4 libsuperlu-dist9_9.2.0+dfsg1-4 libsuperlu7_7.0.1+dfsg1-2 libsystemd0_259~rc1-1 libsz2_1.1.4-2 libtasn1-6_4.20.0-2 libtasn1-6-dev_4.20.0-2 libtinfo6_6.5+20251115-2 libtool_2.5.4-7 libtsan2_15.2.0-8 libubsan1_15.2.0-8 libuchardet0_0.0.8-2 libucx0_1.19.0+ds-1+b1 libudev1_259~rc1-1 libumfpack6_1:7.11.0+dfsg-2 libunbound8_1.24.1-2 libunistring5_1.3-2 libuuid1_2.41.2-4 libx11-6_2:1.8.12-1 libx11-data_2:1.8.12-1 libx11-dev_2:1.8.12-1 libxau-dev_1:1.0.11-1 libxau6_1:1.0.11-1 libxcb1_1.17.0-2+b1 libxcb1-dev_1.17.0-2+b1 libxdmcp-dev_1:1.1.5-1 libxdmcp6_1:1.1.5-1 libxext6_2:1.3.4-1+b3 libxml2-16_2.15.1+dfsg-0.4 libxnvctrl0_535.171.04-1+b2 libyaml-0-2_0.2.5-2 libyaml-dev_0.2.5-2 libzstd-dev_1.5.7+dfsg-2 libzstd1_1.5.7+dfsg-2 linux-libc-dev_6.17.8-1 login.defs_1:4.18.0-2 m4_1.4.20-2 make_4.4.1-3 man-db_2.13.1-1 mawk_1.3.4.20250131-1 media-types_14.0.0 mpi-default-bin_1.19 mpi-default-dev_1.19 ncurses-base_6.5+20250216-2 ncurses-bin_6.5+20250216-2 netbase_6.5 nettle-dev_3.10.2-1 ocl-icd-libopencl1_2.3.4-1 ocl-icd-opencl-dev_2.3.4-1 opencl-c-headers_3.0~2025.07.22-2 opencl-clhpp-headers_3.0~2025.07.22-1 openmpi-bin_5.0.9-1 openmpi-common_5.0.9-1 openssh-client_1:10.2p1-2 openssl-provider-legacy_3.5.4-1 passwd_1:4.18.0-2 patch_2.8-2 patchelf_0.18.0-1.4 perl_5.40.1-7 perl-base_5.40.1-7 perl-modules-5.40_5.40.1-7 pkgconf_1.8.1-4 pkgconf-bin_1.8.1-4 po-debconf_1.0.21+nmu1 
python3_3.13.7-1 python3-click_8.2.0+0.really.8.1.8-1 python3-magic_2:0.4.27-3 python3-minimal_3.13.7-1 python3.13_3.13.9-1 python3.13-minimal_3.13.9-1 readline-common_8.3-3 rpcsvc-proto_1.4.3-1+b2 sed_4.9-2 sensible-utils_0.0.26 sysvinit-utils_3.15-6 tar_1.35+dfsg-3.1 tzdata_2025b-5 util-linux_2.41.2-4 x11proto-dev_2024.1-1 xorg-sgml-doctools_1:1.11-1.1 xtrans-dev_1.6.0-1 xz-utils_5.8.1-2 zlib1g_1:1.3.dfsg+really1.3.1-1+b1 zlib1g-dev_1:1.3.dfsg+really1.3.1-1+b1 +------------------------------------------------------------------------------+ | Build Mon, 05 Jan 2026 10:34:16 +0000 | +------------------------------------------------------------------------------+ Unpack source ------------- -----BEGIN PGP SIGNED MESSAGE----- Hash: SHA512 Format: 3.0 (quilt) Source: slepc Binary: slepc-dev, libslepc-real-dev, libslepc-complex-dev, libslepc-real3.24-dev, libslepc3.24-dev-examples, libslepc-real3.24, slepc3.24-doc, libslepc-complex3.24-dev, libslepc-complex3.24, slepc64-dev, libslepc64-real-dev, libslepc64-complex-dev, libslepc64-real3.24-dev, libslepc64-real3.24, libslepc64-complex3.24-dev, libslepc64-complex3.24 Architecture: any all Version: 3.24.1+dfsg1-1 Maintainer: Debian Science Maintainers Uploaders: "Adam C. 
Powell, IV" , Drew Parsons , Francesco Ballarin Homepage: http://slepc.upv.es/ Standards-Version: 4.7.2 Vcs-Browser: https://salsa.debian.org/science-team/slepc Vcs-Git: https://salsa.debian.org/science-team/slepc.git Testsuite: autopkgtest Testsuite-Triggers: @builddeps@ Build-Depends: dpkg-dev (>= 1.22.5), debhelper-compat (= 13), python3, pkgconf, dh-python, gfortran, chrpath, dh-fortran-mod, libpetsc-real3.24-dev, libpetsc-complex3.24-dev, libpetsc3.24-dev-examples, libpetsc64-real3.24-dev, libpetsc64-complex3.24-dev, libarpack2-dev, libparpack2-dev Package-List: libslepc-complex-dev deb libdevel optional arch=any libslepc-complex3.24 deb libs optional arch=any libslepc-complex3.24-dev deb libdevel optional arch=any libslepc-real-dev deb libdevel optional arch=any libslepc-real3.24 deb libs optional arch=any libslepc-real3.24-dev deb libdevel optional arch=any libslepc3.24-dev-examples deb libdevel optional arch=all libslepc64-complex-dev deb libdevel optional arch=any libslepc64-complex3.24 deb libs optional arch=any libslepc64-complex3.24-dev deb libdevel optional arch=any libslepc64-real-dev deb libdevel optional arch=any libslepc64-real3.24 deb libs optional arch=any libslepc64-real3.24-dev deb libdevel optional arch=any slepc-dev deb libdevel optional arch=any slepc3.24-doc deb doc optional arch=all slepc64-dev deb libdevel optional arch=any Checksums-Sha1: 159c331f6e46ff4403815c2d93b8a6bb92d28a40 23597012 slepc_3.24.1+dfsg1.orig.tar.xz 6bb899609d14665df46833d7e47d72c139158761 21360 slepc_3.24.1+dfsg1-1.debian.tar.xz Checksums-Sha256: ba48811ad927c83ec7ce9eb52c4fcfd97bb055ec5b269ac1b452a677f67d1a0a 23597012 slepc_3.24.1+dfsg1.orig.tar.xz d169298a7f30dc194d764e4a0ae11913bd1472a65df77f80f3cf0599cc067b64 21360 slepc_3.24.1+dfsg1-1.debian.tar.xz Files: b0f6e219f3f4c62271b38b76f62429db 23597012 slepc_3.24.1+dfsg1.orig.tar.xz 0da0e9d693f9435e93a0841afd380678 21360 slepc_3.24.1+dfsg1-1.debian.tar.xz -----BEGIN PGP SIGNATURE----- 
iQIzBAEBCgAdFiEEI8mpPlhYGekSbQo2Vz7x5L1aAfoFAmkht3oACgkQVz7x5L1a AfpWbQ/9Fb7P6RD2NeJFBkxWg0/V7fUSQMV0qJ8hBG8tMlHC5bSrRQ57KaM2KsDe /Y1TgRyKEWkwy4hlCvDPSO9o2V2eMkF4VFFJoa3S6Bf+TDhjMXY7qBis8PzR3Npz Ylg/kHzh0a7+Ar8otZs8lUMfkai8rfQJJLHMWL5ssiQzrKVx2sqLhJTXIp9k6469 A4SqszmW7UlYs63PS8VS4LfKXUhBAQ9h9bC3cAa3eAdHOUjUPob3n0/D7Gs1w0lW 2xDQy4/SWwK+zAr52tNfPb46d+sOkMuD/QSYclh7F9L6b7SbufmJec34Ex3GTTHp V+3BmcCV8KVrF9f+yHBZCOgqkZa91upGyVrU8Qu1zom/NcL0nlDCWl17jFkeHFDo QCRxoPgQOgucuneot01jLaVPXiR/rleSElEN0m7TLZaMZ+e8KubEt1c8gFfbi8MV gNaCS31NcsVURrv91u5/IjfJzIeExPUUXG/VZqFspEHoQsAXzmG+98b3O+FYN6fZ bbo/C0LZ7B/yuomjyYT/xr11dLik4MWy5jcjJIFruCwc/GDQKOdWiaiROFtFvHSZ xSqzecgrJ4ifyo1fXIyWaH094m8/TszeAs/fGqcL7mVyZQfn1XmcrUad9KcbO3LD ppUYcL+119Xn22cZl/eQZhXkMPJik3SlwNFwiIJyb5divwhhNFI= =ghlH -----END PGP SIGNATURE----- dpkg-source: warning: cannot verify inline signature for ./slepc_3.24.1+dfsg1-1.dsc: unsupported subcommand dpkg-source: info: extracting slepc in /build/reproducible-path/slepc-3.24.1+dfsg1 dpkg-source: info: unpacking slepc_3.24.1+dfsg1.orig.tar.xz dpkg-source: info: unpacking slepc_3.24.1+dfsg1-1.debian.tar.xz dpkg-source: info: using patch list from debian/patches/series dpkg-source: info: applying double_colon_patch dpkg-source: info: applying configure_python3.patch dpkg-source: info: applying build_suffix.patch dpkg-source: info: applying skip_test7f.patch dpkg-source: info: applying test_nox.patch dpkg-source: info: applying ignore_git_hidden_folder_in_config_slepc_py.patch Check disk space ---------------- Sufficient free space for build User Environment ---------------- APT_CONFIG=/var/lib/sbuild/apt.conf DEB_BUILD_OPTIONS=parallel=4 HOME=/sbuild-nonexistent LANG=C.UTF-8 LC_ALL=C.UTF-8 LC_COLLATE=C.UTF-8 LC_CTYPE=C.UTF-8 LOGNAME=sbuild PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games SHELL=/bin/sh SOURCE_DATE_EPOCH=1763813105 USER=sbuild dpkg-buildpackage ----------------- Command: dpkg-buildpackage --sanitize-env -us -uc -B 
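The build environment above pins `SOURCE_DATE_EPOCH=1763813105` (derived from the latest `debian/changelog` entry) so that any timestamps embedded in the build output are deterministic. A small sketch of how such a value is consumed, e.g. to normalise an artifact's mtime; the temporary file is a stand-in:

```shell
# SOURCE_DATE_EPOCH from the environment above; build tools clamp embedded
# dates to this instant rather than using the wall clock.
epoch=1763813105
date -u -d "@$epoch" +'%Y-%m-%d %H:%M:%S UTC'   # the changelog timestamp this encodes
artifact=$(mktemp)
touch -d "@$epoch" "$artifact"                  # normalise the file's mtime
rm "$artifact"
```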
dpkg-buildpackage: info: source package slepc dpkg-buildpackage: info: source version 3.24.1+dfsg1-1 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Drew Parsons dpkg-source --before-build . dpkg-buildpackage: info: host architecture riscv64 debian/rules clean dh clean --with python3,fortran_mod debian/rules override_dh_auto_clean make[1]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' if [ -d installed-arch-linux2-c-opt ]; then \ dh_auto_clean -plibslepc-real3.24-dev -pslepc3.24-doc -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex; \ fi if [ -d installed-arch-linux2-c-opt-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-real3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex; \ fi make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' debian/rules override_dh_clean make[1]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' if [ -d installed-arch-linux2-c-opt ]; 
then \ dh_auto_clean -plibslepc-real3.24-dev -pslepc3.24-doc -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex; \ fi if [ -d installed-arch-linux2-c-opt-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-real3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real; \ fi if [ -d installed-arch-linux2-c-opt-complex-64 -a -f /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib/petsc/conf/petscrules ]; then \ dh_auto_clean -plibslepc64-complex3.24-dev -- \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex; \ fi dh_clean rm -rf installed-* rm -f lib/slepc/conf/slepcvariables rm -f make.log configure.log find config -name *.pyc | xargs rm -f rm -rf installed-arch-linux2-c-opt installed-arch-linux2-c-opt-complex installed-arch-linux2-c-opt-64 installed-arch-linux2-c-opt-complex-64 make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' debian/rules binary-arch dh binary-arch --with python3,fortran_mod dh_update_autotools_config -a dh_autoreconf -a debian/rules override_dh_auto_configure make[1]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' if PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real \ ./configure --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real 
\ --with-arpack=1 ; then \ : ; \ else \ err=$?; \ echo "real configure failed with exit value $err"; \ echo "===== show real configure.log ====="; \ cat installed-arch-linux2-c-opt/lib/slepc/conf/configure.log; \ echo "===== end real configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Checking ARPACK... done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real Prefix install with double precision real numbers SCALAPACK from SCALAPACK linked by PETSc ARPACK library flags: -lparpack -larpack xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex \ ./configure --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex \ --with-arpack=1 ; then \ : ; \ else \ err=$?; \ echo "complex configure failed with exit value $err"; \ echo "===== show complex configure.log ====="; \ cat installed-arch-linux2-c-opt-complex/lib/slepc/conf/configure.log; \ echo "===== end complex configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Checking ARPACK... 
done Writing various configuration files... done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex Prefix install with double precision complex numbers SCALAPACK from SCALAPACK linked by PETSc ARPACK library flags: -lparpack -larpack xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real \ ./configure --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real \ --build-suffix="64" ; then \ : ; \ else \ err=$?; \ echo "64-bit real configure failed with exit value $err"; \ echo "===== show 64-bit real configure.log ====="; \ cat installed-arch-linux2-c-opt-64/lib/slepc/conf/configure.log; \ echo "===== end 64-bit real configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Writing various configuration files... 
done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real Prefix install with double precision real numbers SCALAPACK from SCALAPACK linked by PETSc xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real xxx==========================================================================xxx if PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex \ ./configure --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex \ --build-suffix="64" ; then \ : ; \ else \ err=$?; \ echo "64-bit complex configure failed with exit value $err"; \ echo "===== show 64-bit complex configure.log ====="; \ cat installed-arch-linux2-c-opt-complex-64/lib/slepc/conf/configure.log; \ echo "===== end 64-bit complex configure.log ====="; \ (exit $err); \ fi Checking environment... Generating Fortran bindings... done Checking PETSc installation... done Checking LAPACK library... done Checking SCALAPACK... done Writing various configuration files... 
done ================================================================================ SLEPc Configuration ================================================================================ SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 SLEPc prefix directory: /usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex Prefix install with double precision complex numbers SCALAPACK from SCALAPACK linked by PETSc xxx==========================================================================xxx Configure stage complete. Now build the SLEPc library with: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex xxx==========================================================================xxx make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' debian/rules override_dh_auto_build make[1]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' dh_auto_build -plibslepc-real3.24-dev -pslepc3.24-doc -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt make[2]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================== Starting make run on sbuild at Mon, 05 Jan 2026 10:35:14 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: 
/build/reproducible-path/slepc-3.24.1+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real Using PETSc arch: installed-arch-linux2-c-opt ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 1 SLEPC_VERSION_DATE "Nov 07, 2025" SLEPC_VERSION_GIT "v3.24.1" SLEPC_VERSION_DATE_GIT "2025-11-07 09:19:15 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real --with-arpack=1 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real" #define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.1+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-real/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_ARPACK 1 #define SLEPC_ARPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:arpack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 1 PETSC_VERSION_DATE "Oct 29, 2025" PETSC_VERSION_GIT "v3.24.1" PETSC_VERSION_DATE_GIT "2025-10-29 13:15:15 -0500" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --with-library-name-suffix=_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" 
--with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lscotch -lptscotcherr" --with-hypre=1 --with-hypre-include=/usr/include/hypre --with-hypre-lib=-lHYPRE --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --prefix=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real --PETSC_ARCH=riscv64-linux-gnu-real CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_F90_2PTR_ARG 1 #define PETSC_HAVE_FCNTL_H 1 
#define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_HYPRE 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define 
PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:hypre:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:superlu:superlu_dist:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SUPERLU 1 #define PETSC_HAVE_SUPERLU_DIST 1 #define PETSC_HAVE_SUPERLU_DIST_SINGLE 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define 
PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib" #define PETSC_LIB_NAME_SUFFIX "_real" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MISSING_LAPACK_lsame 1 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 
----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-8) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-8) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-8) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/lib -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/lib -lslepc_real -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_real -lHYPRE -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: 
/usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo1988 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.1+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsprivf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/finitf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvorthog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/interface/bvlapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/dsutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/ghep/dsghep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/ghiep/hz.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/ghiep/invit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dlaed3m.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/bdc/dlaed3m.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dmerg2.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/bdc/dmerg2.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dibtdc.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/bdc/dibtdc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsbtdc.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/bdc/dsbtdc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsrtdf.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/bdc/dsrtdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/nep/dsnep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hep/dshep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/nhep/dsnhep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/pep/dspep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/interface/dsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/interface/dsops.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/interface/dspriv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/impls/svd/dssvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/combine/fncombine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/fnutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/phi/fnphi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/rational/fnrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/log/fnlog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/exp/fnexp.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/interface/fnregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/impls/interval/rginterval.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/interface/fnbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/impls/ring/rgring.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/interface/rgregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/interface/rgbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/cayley/cayley.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/filter/chebyshev.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/filter/filter.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/precond/precond.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/shell/shell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/sinvert/sinvert.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/shift/shift.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stset.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stshellmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt/obj/src/sys/dlregisslepc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/impls/filter/filtlan.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stsles.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt/obj/src/sys/finit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt/obj/src/sys/ftn-custom/zstart.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt/obj/src/sys/mat/matstruct.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/interface/stsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt/obj/src/sys/slepccontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt/obj/src/sys/slepcinit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt/obj/src/sys/slepcsc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt/obj/src/sys/vec/pool.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt/obj/src/sys/slepcutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt/obj/src/sys/mat/matutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt/obj/src/sys/vec/vecutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/cg/rqcg/rqcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/ciss/cissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/davidson/gd/gdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/davidson/jd/jdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdgd2.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/cg/lobpcg/lobpcg.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdcalcpairs.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdschm.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/ciss/ciss.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdinitv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdtestconv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdutils.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdupdatev.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/dvdimprovex.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/gd/gd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/davidson/jd/jd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/external/scalapack/scalapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/arpack/arpack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/external/arpack/arpack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/arnoldi/arnoldi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/epskrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-indef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/krylovschur.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-bse.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/lapack/lapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/lanczos/lanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/subspace/subspace.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/lyapii/lyapii.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/dlregiseps.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/power/power.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt/obj/src/eps/impls/krylov/krylovschur/ks-slice.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsdefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epssetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/ftn-custom/zepsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/cyclic/cyclicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/cross/crossf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epsview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/lanczos/gklanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/impls/trlanczos/trlanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt/obj/src/eps/interface/epssolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/dlregissvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svddefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt/obj/ftn/svd/interface/svdviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/cross/cross.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/external/scalapack/svdscalap.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/lanczos/gklanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/randomized/rsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/cyclic/cyclic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/dlregissvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/ftn-custom/zsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svddefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt/obj/src/svd/interface/svdsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/include mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc_real.so.3.24 -o installed-arch-linux2-c-opt/lib/libslepc_real.so.3.24.1 @installed-arch-linux2-c-opt/lib/libslepc_real.so.3.24.1.args -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_real -lHYPRE -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' make[4]: Leaving directory 
'/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================= Now to install the library do: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real install ========================================= make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' dh_auto_build -plibslepc-complex3.24-dev -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex make[2]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================== Starting make run on sbuild at Mon, 05 Jan 2026 10:39:59 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex Using PETSc arch: installed-arch-linux2-c-opt-complex ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 1 SLEPC_VERSION_DATE "Nov 07, 2025" SLEPC_VERSION_GIT "v3.24.1" SLEPC_VERSION_DATE_GIT "2025-11-07 09:19:15 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex --with-arpack=1 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex" 
#define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.1+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc3.24/riscv64-linux-gnu-complex/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_ARPACK 1 #define SLEPC_ARPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:arpack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 1 PETSC_VERSION_DATE "Oct 29, 2025" PETSC_VERSION_GIT "v3.24.1" PETSC_VERSION_DATE_GIT "2025-10-29 13:15:15 -0500" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-debugging=0 --with-scalar-type=complex --with-library-name-suffix=_complex --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch --with-ptscotch-lib="-lptesmumps -lptscotch -lscotch -lptscotcherr" --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse 
--with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --with-superlu=1 --with-superlu-include=/usr/include/superlu --with-superlu-lib=-lsuperlu --with-superlu_dist=1 --with-superlu_dist-include=/usr/include/superlu-dist --with-superlu_dist-lib=-lsuperlu_dist --prefix=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex --PETSC_ARCH=riscv64-linux-gnu-complex CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex" #define PETSC_DIR_SEPARATOR '/' #define 
PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 #define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_F90_2PTR_ARG 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 
1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:superlu:superlu_dist:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SUPERLU 1 
#define PETSC_HAVE_SUPERLU_DIST 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib" #define PETSC_LIB_NAME_SUFFIX "_complex" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MISSING_LAPACK_lsame 1 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define 
PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_COMPLEX 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 ----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-8) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-8) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-8) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/lib -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/lib -lslepc_complex -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: /usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo3849 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.1+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-complex mpicc -c -g -O2 -Werror=implicit-function-declaration 
-ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/bv/interface/bvorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/impls/pep/dspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsbasicf.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsprivf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/phi/fnphif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/ds/interface/dsopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/fn/interface/fnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/impls/ring/rgringf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/filter/filterf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/rg/interface/rgbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/precond/precondf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/impls/shell/shellf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stsetf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/dsutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghep/dsghep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/hz.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/ghiep/invit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/hep/dshep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nhep/dsnhep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/pep/dspep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/nep/dsnep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dsops.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/interface/dspriv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/impls/svd/dssvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/combine/fncombine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/fnutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/phi/fnphi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/log/fnlog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/rational/fnrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/exp/fnexp.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/interface/fnregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/interval/rginterval.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/interface/fnbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/impls/ring/rgring.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/interface/rgregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/interface/rgbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/cayley/cayley.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/chebyshev.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/precond/precond.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/filter.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shell/shell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/sinvert/sinvert.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/shift/shift.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/impls/filter/filtlan.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stset.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stfunc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/dlregisslepc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stshellmat.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/finit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/ftn-custom/zstart.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stsles.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/mat/matstruct.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/interface/stsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepccontour.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcinit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcsc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/mat/matutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/pool.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/slepcutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/vecutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/cg/rqcg/rqcgf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/ciss/cissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/davidson/gd/gdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/davidson/jd/jdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-complex/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/cg/lobpcg/lobpcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/external/scalapack/scalapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/arnoldi/arnoldi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/arpack/arpack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/external/arpack/arpack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/epskrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-indef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/krylovschur.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-bse.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/lapack/lapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/lanczos/lanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/lyapii/lyapii.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/subspace/subspace.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/krylov/krylovschur/ks-slice.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/dlregiseps.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/impls/power/power.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsdefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epssolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epssetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/cross/crossf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/cyclic/cyclicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/ftn-custom/zepsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/dlregissvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/lanczos/gklanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-complex/obj/src/eps/interface/epsview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/impls/trlanczos/trlanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svddefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/svd/interface/svdviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/external/scalapack/svdscalap.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/cross/cross.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/lanczos/gklanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/lapack/svdlapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/randomized/rsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/dlregissvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/ftn-custom/zsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/cyclic/cyclic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svddefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/interface/svdview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/jd/pjdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/ciss/pcissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/ciss/pcissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/stoar/qslicef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/toar/ptoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/krylov/stoar/stoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/dlregispepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/impls/linear/linearf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-complex/obj/src/svd/impls/trlanczos/trlanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/pep/interface/pepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/ciss/pciss.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/ciss/pciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/ciss/ncissf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/ciss/ncissf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-complex/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/ciss/nciss.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/ciss/nciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-complex/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/impls/krylov/mfnkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/impls/expokit/mfnexpokit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/ftn-custom/zmfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/dlregislmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-complex/obj/src/mfn/interface/mfnsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmedensef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmebasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmemonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmeoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmesetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-complex/obj/ftn/lme/interface/lmesolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/dlregislme.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/ftn-custom/zlmef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmebasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/impls/krylov/lmekrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmemon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmeregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmeopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmedense.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmesetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-complex/obj/src/lme/interface/lmesolve.o mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include -I/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/superlu-dist -I/usr/include/superlu -I/usr/include/scotch -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-complex/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/include mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc_complex.so.3.24 -o installed-arch-linux2-c-opt-complex/lib/libslepc_complex.so.3.24.1 @installed-arch-linux2-c-opt-complex/lib/libslepc_complex.so.3.24.1.args -lparpack -larpack -L/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps -lzmumps -lsmumps -lcmumps -lmumps_common -lpord -lscalapack-openmpi -lsuperlu_dist -lsuperlu -llapack -lblas -lptesmumps -lptscotch -lscotch -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' make[4]: Leaving 
directory '/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================= Now to install the library do: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex install ========================================= make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' dh_auto_build -plibslepc64-real3.24-dev -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64 make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64 make[2]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================== Starting make run on sbuild at Mon, 05 Jan 2026 10:44:44 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real Using PETSc arch: installed-arch-linux2-c-opt-64 ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 1 SLEPC_VERSION_DATE "Nov 07, 2025" SLEPC_VERSION_GIT "v3.24.1" SLEPC_VERSION_DATE_GIT "2025-11-07 09:19:15 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real --build-suffix=64 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real" 
#define SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.1+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-real/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 1 PETSC_VERSION_DATE "Oct 29, 2025" PETSC_VERSION_GIT "v3.24.1" PETSC_VERSION_DATE_GIT "2025-10-29 13:15:15 -0500" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-64-bit-indices --with-debugging=0 --with-library-name-suffix=64_real --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch_64i --with-ptscotch-lib="-lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr" --with-hypre=1 --with-hypre-include=/usr/include/hypre64 --with-hypre-lib=-lHYPRE64 --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64" --with-suitesparse=1 
--with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr -lumfpack -lamd -lcholmod -lklu" --prefix=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real --PETSC_ARCH=riscv64-linux-gnu-real-64 CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 
#define PETSC_HAVE_ATTRIBUTEALIGNED 1
#define PETSC_HAVE_BUILTIN_EXPECT 1
#define PETSC_HAVE_BZERO 1
#define PETSC_HAVE_C99_COMPLEX 1
#define PETSC_HAVE_CLOCK 1
#define PETSC_HAVE_CXX 1
#define PETSC_HAVE_CXXABI_H 1
#define PETSC_HAVE_CXX_ATOMIC 1
#define PETSC_HAVE_CXX_COMPLEX 1
#define PETSC_HAVE_CXX_COMPLEX_FIX 1
#define PETSC_HAVE_CXX_DIALECT_CXX11 1
#define PETSC_HAVE_DLADDR 1
#define PETSC_HAVE_DLCLOSE 1
#define PETSC_HAVE_DLERROR 1
#define PETSC_HAVE_DLFCN_H 1
#define PETSC_HAVE_DLOPEN 1
#define PETSC_HAVE_DLSYM 1
#define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1
#define PETSC_HAVE_DRAND48 1
#define PETSC_HAVE_DYNAMIC_LIBRARIES 1
#define PETSC_HAVE_ERF 1
#define PETSC_HAVE_EXECUTABLE_EXPORT 1
#define PETSC_HAVE_F90_2PTR_ARG 1
#define PETSC_HAVE_FCNTL_H 1
#define PETSC_HAVE_FENV_H 1
#define PETSC_HAVE_FE_VALUES 1
#define PETSC_HAVE_FFTW 1
#define PETSC_HAVE_FLOAT_H 1
#define PETSC_HAVE_FORK 1
#define PETSC_HAVE_FORTRAN_FLUSH 1
#define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1
#define PETSC_HAVE_FORTRAN_TYPE_STAR 1
#define PETSC_HAVE_FORTRAN_UNDERSCORE 1
#define PETSC_HAVE_GETCWD 1
#define PETSC_HAVE_GETDOMAINNAME 1
#define PETSC_HAVE_GETHOSTBYNAME 1
#define PETSC_HAVE_GETHOSTNAME 1
#define PETSC_HAVE_GETPAGESIZE 1
#define PETSC_HAVE_GETRUSAGE 1
#define PETSC_HAVE_HDF5 1
#define PETSC_HAVE_HYPRE 1
#define PETSC_HAVE_INTTYPES_H 1
#define PETSC_HAVE_ISINF 1
#define PETSC_HAVE_ISNAN 1
#define PETSC_HAVE_ISNORMAL 1
#define PETSC_HAVE_LGAMMA 1
#define PETSC_HAVE_LINUX 1
#define PETSC_HAVE_LOG2 1
#define PETSC_HAVE_LSEEK 1
#define PETSC_HAVE_MALLOC_H 1
#define PETSC_HAVE_MEMMOVE 1
#define PETSC_HAVE_MKSTEMP 1
#define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP
#define PETSC_HAVE_MPIIO 1
#define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1
#define PETSC_HAVE_MPI_COMBINER_DUP 1
#define PETSC_HAVE_MPI_COMBINER_NAMED 1
#define PETSC_HAVE_MPI_COUNT 1
#define PETSC_HAVE_MPI_F90MODULE 1
#define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1
#define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1
#define PETSC_HAVE_MPI_GET_ACCUMULATE 1
#define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1
#define PETSC_HAVE_MPI_INIT_THREAD 1
#define PETSC_HAVE_MPI_INT64_T 1
#define PETSC_HAVE_MPI_LONG_DOUBLE 1
#define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1
#define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1
#define PETSC_HAVE_MPI_ONE_SIDED 1
#define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1
#define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1
#define PETSC_HAVE_MPI_REDUCE_LOCAL 1
#define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1
#define PETSC_HAVE_MPI_RGET 1
#define PETSC_HAVE_MPI_WIN_CREATE 1
#define PETSC_HAVE_MUMPS 1
#define PETSC_HAVE_NANOSLEEP 1
#define PETSC_HAVE_NETDB_H 1
#define PETSC_HAVE_NETINET_IN_H 1
#define PETSC_HAVE_NO_FINITE_MATH_ONLY 1
#define PETSC_HAVE_OPENCL 1
#define PETSC_HAVE_OPENMPI 1
#define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:hypre:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:umfpack:x11:yaml:"
#define PETSC_HAVE_POPEN 1
#define PETSC_HAVE_POSIX_MEMALIGN 1
#define PETSC_HAVE_PTHREAD 1
#define PETSC_HAVE_PTHREAD_MUTEX 1
#define PETSC_HAVE_PTSCOTCH 1
#define PETSC_HAVE_PWD_H 1
#define PETSC_HAVE_RAND 1
#define PETSC_HAVE_READLINK 1
#define PETSC_HAVE_REALPATH 1
#define PETSC_HAVE_REGEX 1
#define PETSC_HAVE_RTLD_DEFAULT 1
#define PETSC_HAVE_RTLD_GLOBAL 1
#define PETSC_HAVE_RTLD_LAZY 1
#define PETSC_HAVE_RTLD_LOCAL 1
#define PETSC_HAVE_RTLD_NOW 1
#define PETSC_HAVE_SCALAPACK 1
#define PETSC_HAVE_SETJMP_H 1
#define PETSC_HAVE_SHMGET 1
#define PETSC_HAVE_SLEEP 1
#define PETSC_HAVE_SNPRINTF 1
#define PETSC_HAVE_SOCKET 1
#define PETSC_HAVE_SO_REUSEADDR 1
#define PETSC_HAVE_STDATOMIC_H 1
#define PETSC_HAVE_STDINT_H 1
#define PETSC_HAVE_STRCASECMP 1
#define PETSC_HAVE_STRINGS_H 1
#define PETSC_HAVE_STRUCT_SIGACTION 1
#define PETSC_HAVE_SUITESPARSE 1
#define PETSC_HAVE_SYS_PARAM_H 1
#define PETSC_HAVE_SYS_PROCFS_H 1
#define PETSC_HAVE_SYS_RESOURCE_H 1
#define PETSC_HAVE_SYS_SOCKET_H 1
#define PETSC_HAVE_SYS_TIMES_H 1
#define PETSC_HAVE_SYS_TIME_H 1
#define PETSC_HAVE_SYS_TYPES_H 1
#define PETSC_HAVE_SYS_UTSNAME_H 1
#define PETSC_HAVE_SYS_WAIT_H 1
#define PETSC_HAVE_TAU_PERFSTUBS 1
#define PETSC_HAVE_TGAMMA 1
#define PETSC_HAVE_TIME 1
#define PETSC_HAVE_TIME_H 1
#define PETSC_HAVE_UNAME 1
#define PETSC_HAVE_UNISTD_H 1
#define PETSC_HAVE_USLEEP 1
#define PETSC_HAVE_VA_COPY 1
#define PETSC_HAVE_VSNPRINTF 1
#define PETSC_HAVE_X 1
#define PETSC_HAVE_YAML 1
#define PETSC_HDF5_HAVE_PARALLEL 1
#define PETSC_HDF5_HAVE_SZLIB 1
#define PETSC_HDF5_HAVE_ZLIB 1
#define PETSC_INTPTR_T intptr_t
#define PETSC_INTPTR_T_FMT "#" PRIxPTR
#define PETSC_IS_COLORING_MAX USHRT_MAX
#define PETSC_IS_COLORING_VALUE_TYPE short
#define PETSC_IS_COLORING_VALUE_TYPE_F integer2
#define PETSC_LEVEL1_DCACHE_LINESIZE 64
#define PETSC_LIB_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib"
#define PETSC_LIB_NAME_SUFFIX "64_real"
#define PETSC_MAX_PATH_LEN 4096
#define PETSC_MEMALIGN 16
#define PETSC_MISSING_LAPACK_lsame 1
#define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi"
#define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT
#define PETSC_OMAKE "/usr/bin/make --no-print-directory"
#define PETSC_PREFETCH_HINT_NTA 0
#define PETSC_PREFETCH_HINT_T0 3
#define PETSC_PREFETCH_HINT_T1 2
#define PETSC_PREFETCH_HINT_T2 1
#define PETSC_PYTHON_EXE "/usr/bin/python3"
#define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c))
#define PETSC_REPLACE_DIR_SEPARATOR '\\'
#define PETSC_SIGNAL_CAST
#define PETSC_SIZEOF_INT 4
#define PETSC_SIZEOF_LONG 8
#define PETSC_SIZEOF_LONG_LONG 8
#define PETSC_SIZEOF_SIZE_T 8
#define PETSC_SIZEOF_VOID_P 8
#define PETSC_SLSUFFIX "so"
#define PETSC_UINTPTR_T uintptr_t
#define PETSC_UINTPTR_T_FMT "#" PRIxPTR
#define PETSC_UNUSED __attribute((unused))
#define PETSC_USE_64BIT_INDICES 1
#define PETSC_USE_AVX512_KERNELS 1
#define PETSC_USE_CTABLE 1
#define
PETSC_USE_DEBUGGER "gdb"
#define PETSC_USE_DMLANDAU_2D 1
#define PETSC_USE_FORTRAN_BINDINGS 1
#define PETSC_USE_INFO 1
#define PETSC_USE_ISATTY 1
#define PETSC_USE_LOG 1
#define PETSC_USE_MALLOC_COALESCED 1
#define PETSC_USE_PROC_FOR_SIZE 1
#define PETSC_USE_REAL_DOUBLE 1
#define PETSC_USE_SHARED_LIBRARIES 1
#define PETSC_USE_SINGLE_LIBRARY 1
#define PETSC_USE_SOCKET_VIEWER 1
#define PETSC_USE_VISIBILITY_C 1
#define PETSC_USE_VISIBILITY_CXX 1
#define PETSC_USING_64BIT_PTR 1
#define PETSC_USING_F2003 1
#define PETSC_USING_F90FREEFORM 1
#define PETSC__BSD_SOURCE 1
#define PETSC__DEFAULT_SOURCE 1
#define PETSC__GNU_SOURCE 1
-----------------------------------------
Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC
C compiler version: gcc (Debian 15.2.0-8) 15.2.0
Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
C++ compiler version: g++ (Debian 15.2.0-8) 15.2.0
Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi
Fortran compiler version: GNU Fortran (Debian 15.2.0-8) 15.2.0
-----------------------------------------
Using C/C++ linker: mpicc
Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC
Using Fortran linker: mpif90
Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0
-----------------------------------------
Using libraries: -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/lib -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/lib -lslepc64_real -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_real -lHYPRE64 -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
------------------------------------------
Using mpiexec: /usr/bin/mpiexec --oversubscribe
------------------------------------------
Using MAKE: /usr/bin/make
Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo5712 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64
==========================================
/usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs
make[5]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
/usr/bin/python3 /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.1+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-64
mpicc -c -g -O2 -Werror=implicit-function-declaration
-ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvcontourf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvkrylovf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvfuncf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvglobalf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvopsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/bv/interface/bvorthogf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/pep/dspepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsbasicf.o
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/phi/fnphif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsopsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/ds/interface/dsprivf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/fn/interface/fnbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/ring/rgringf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/shell/shellf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/precond/precondf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/impls/filter/filterf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/rg/interface/rgbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stsetf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stslesf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/classes/st/interface/stfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/finitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/mat/matstructf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/mat/matutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcinitf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/slepcscf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/vec/veccompf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-64/obj/ftn/sys/vec/vecutilf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/contiguous/contig.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/mat/bvmat.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/svec/svec.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/vecs/vecs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/impls/tensor/bvtensor.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvbiorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvblas.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvfunc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvcontour.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvglobal.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/interface/bvorthog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/dsutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghep/dsghep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/hz.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/invit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dlaed3m.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dlaed3m.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dmerg2.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dmerg2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dibtdc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dibtdc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsbtdc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dsbtdc.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/bdc/dsrtdf.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/bdc/dsrtdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hep/dshep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nep/dsnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nhep/dsnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/pep/dspep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dsops.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/impls/svd/dssvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/combine/fncombine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/interface/dspriv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/phi/fnphi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/fnutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/log/fnlog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/rational/fnrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/exp/fnexp.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/interface/fnregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/interface/fnbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/interval/rginterval.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/interface/rgregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/interface/rgbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/impls/ring/rgring.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/cayley/cayley.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/chebyshev.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/filter.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shell/shell.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/precond/precond.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/shift/shift.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/sinvert/sinvert.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stfunc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stset.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stshellmat.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/impls/filter/filtlan.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/finit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/dlregisslepc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-64/obj/src/sys/ftn-custom/zstart.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stsles.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-64/obj/src/sys/mat/matstruct.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/interface/stsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcinit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepccontour.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/mat/matutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-64/obj/src/sys/slepcsc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/pool.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/cg/rqcg/rqcgf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/vecutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/ciss/cissf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/davidson/jd/jdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/davidson/gd/gdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/lyapii/lyapiif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/impls/power/powerf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-64/obj/src/sys/vec/veccomp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/dlregisepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epssetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epssolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/eps/interface/epsviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/cg/rqcg/rqcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/davidson.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/cg/lobpcg/lobpcg.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdgd2.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdinitv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/ciss/ciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdschm.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdcalcpairs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdtestconv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdupdatev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdutils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/dvdimprovex.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/gd/gd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/external/scalapack/scalapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/davidson/jd/jd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/arnoldi/arnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/dlregiseps.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/subspace/subspace.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/krylov/krylovschur/ks-slice.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-64/obj/src/eps/impls/power/power.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsdefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epssetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/cross/crossf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/ftn-custom/zepsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/cyclic/cyclicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/lanczos/gklanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epsview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/impls/trlanczos/trlanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-64/obj/src/eps/interface/epssolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/dlregissvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svddefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/svd/interface/svdviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/external/scalapack/svdscalap.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/cross/cross.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/lapack/svdlapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/lanczos/gklanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/dlregissvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/randomized/rsvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/cyclic/cyclic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/ftn-custom/zsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svddefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/jd/pjdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-64/obj/src/svd/interface/svdview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/stoar/qslicef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/stoar/stoarf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/krylov/toar/ptoarf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/impls/linear/linearf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/dlregispepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepdefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-64/obj/src/svd/impls/trlanczos/trlanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/pep/interface/pepviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/pepkrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/stoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/stoar/qslice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/jd/pjd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/toar/ptoar.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/linear/qeplin.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/peputils.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/dlregispep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/krylov/toar/nrefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/ftn-custom/zpepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-64/obj/src/pep/impls/linear/linear.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/narnoldi/narnoldif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/interpol/interpolf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/pepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-64/obj/src/pep/interface/peprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/nleigs/nleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/rii/riif.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-64/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/nleigs-fullb.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/rii/rii.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nepdefl.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/dlregisnep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/slp/slp-twosided.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/slp/slp.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/ftn-custom/znepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepdefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-64/obj/src/nep/impls/nleigs/nleigs.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepresolv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/dlregismfnf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/nepview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-64/obj/src/nep/interface/neprefine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/mfn/interface/mfnsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/ftn-custom/zmfnf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/dlregismfn.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/impls/krylov/mfnkrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/impls/expokit/mfnexpokit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-64/obj/src/mfn/interface/mfnsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/dlregislmef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmebasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmemonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmedensef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmeoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmesetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-64/obj/ftn/lme/interface/lmesolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/dlregislme.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/ftn-custom/zlmef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-64/obj/src/lme/impls/krylov/lmekrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmemon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmebasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmeopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmeregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmedense.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmesetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-64/obj/src/lme/interface/lmesolve.o
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-64/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/include -I/usr/include/hypre64 -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-64/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/include mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc64_real.so.3.24 -o installed-arch-linux2-c-opt-64/lib/libslepc64_real.so.3.24.1 @installed-arch-linux2-c-opt-64/lib/libslepc64_real.so.3.24.1.args -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_real -lHYPRE64 -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' 
========================================= Now to install the library do: make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real install ========================================= make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1' dh_auto_build -plibslepc64-complex3.24-dev -- V=1 \ SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \ PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 make -j4 V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 make[2]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' sed: -e expression #1, char 47: unknown option to `s' /usr/bin/bash: line 4: [: too many arguments make[4]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' ========================================== Starting make run on sbuild at Mon, 05 Jan 2026 10:49:30 +0000 Machine characteristics: Linux sbuild 6.6.87-win2030 #2025.04.20.18.43+bb0c69aea SMP Sun Apr 20 18:58:14 UTC 2025 riscv64 GNU/Linux ----------------------------------------- Using SLEPc directory: /build/reproducible-path/slepc-3.24.1+dfsg1 Using PETSc directory: /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex Using PETSc arch: installed-arch-linux2-c-opt-complex-64 ----------------------------------------- SLEPC_VERSION_RELEASE 1 SLEPC_VERSION_MAJOR 3 SLEPC_VERSION_MINOR 24 SLEPC_VERSION_SUBMINOR 1 SLEPC_VERSION_DATE "Nov 07, 2025" SLEPC_VERSION_GIT "v3.24.1" SLEPC_VERSION_DATE_GIT "2025-11-07 09:19:15 +0100" ----------------------------------------- Using SLEPc configure options: --prefix=/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex --build-suffix=64 Using SLEPc configuration flags: #define SLEPC_PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex" #define 
SLEPC_PETSC_ARCH "" #define SLEPC_DIR "/build/reproducible-path/slepc-3.24.1+dfsg1" #define SLEPC_LIB_DIR "/usr/lib/slepcdir/slepc64-3.24/riscv64-linux-gnu-complex/lib" #define SLEPC_HAVE_SCALAPACK 1 #define SLEPC_SCALAPACK_HAVE_UNDERSCORE 1 #define SLEPC_HAVE_PACKAGES ":scalapack:" ----------------------------------------- PETSC_VERSION_RELEASE 1 PETSC_VERSION_MAJOR 3 PETSC_VERSION_MINOR 24 PETSC_VERSION_SUBMINOR 1 PETSC_VERSION_DATE "Oct 29, 2025" PETSC_VERSION_GIT "v3.24.1" PETSC_VERSION_DATE_GIT "2025-10-29 13:15:15 -0500" ----------------------------------------- Using PETSc configure options: --build=riscv64-linux-gnu --prefix=/usr --includedir=/include --mandir=/share/man --infodir=/share/info --sysconfdir=/etc --localstatedir=/var --with-option-checking=0 --with-silent-rules=0 --libdir=/lib/riscv64-linux-gnu --runstatedir=/run --with-maintainer-mode=0 --with-dependency-tracking=0 --with-64-bit-indices --with-debugging=0 --with-scalar-type=complex --with-library-name-suffix=64_complex --with-shared-libraries --with-pic=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --with-cxx-dialect=C++11 --with-opencl=1 --with-blas-lib=-lblas --with-lapack-lib=-llapack --with-scalapack=1 --with-scalapack-lib=-lscalapack-openmpi --with-fftw=1 --with-fftw-include="[]" --with-fftw-lib="-lfftw3 -lfftw3_mpi" --with-yaml=1 --with-valgrind=1 --with-hdf5-include=/usr/include/hdf5/openmpi --with-hdf5-lib="-L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -lhdf5 -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi " --CXX_LINKER_FLAGS=-Wl,--no-as-needed --with-ptscotch=1 --with-ptscotch-include=/usr/include/scotch_64i --with-ptscotch-lib="-lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr" --with-mumps=1 --with-mumps-include="[]" --with-mumps-lib="-ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64" --with-suitesparse=1 --with-suitesparse-include=/usr/include/suitesparse --with-suitesparse-lib="-lspqr 
-lumfpack -lamd -lcholmod -lklu" --prefix=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex --PETSC_ARCH=riscv64-linux-gnu-complex-64 CFLAGS="-g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" CXXFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC" FCFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" FFLAGS="-g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0" CPPFLAGS="-Wdate-time -D_FORTIFY_SOURCE=2" LDFLAGS="-Wl,-z,relro -fPIC" MAKEFLAGS= Using PETSc configuration flags: #define PETSC_ARCH "" #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned(size))) #define PETSC_BLASLAPACK_UNDERSCORE 1 #define PETSC_CLANGUAGE_C 1 #define PETSC_CXX_RESTRICT __restrict #define PETSC_DEPRECATED_ENUM_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_FUNCTION_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_MACRO_BASE(string_literal_why) PETSC_DEPRECATED_MACRO_BASE_(GCC warning string_literal_why) #define PETSC_DEPRECATED_MACRO_BASE_(why) _Pragma(#why) #define PETSC_DEPRECATED_OBJECT_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DEPRECATED_TYPEDEF_BASE(string_literal_why) __attribute__((deprecated(string_literal_why))) #define PETSC_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex" #define PETSC_DIR_SEPARATOR '/' #define PETSC_FORTRAN_CHARLEN_T size_t #define PETSC_FUNCTION_NAME_C __func__ #define PETSC_FUNCTION_NAME_CXX __func__ #define PETSC_HAVE_ACCESS 1 #define PETSC_HAVE_ATOLL 1 #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #define PETSC_HAVE_BUILTIN_EXPECT 1 
#define PETSC_HAVE_BZERO 1 #define PETSC_HAVE_C99_COMPLEX 1 #define PETSC_HAVE_CLOCK 1 #define PETSC_HAVE_CXX 1 #define PETSC_HAVE_CXXABI_H 1 #define PETSC_HAVE_CXX_ATOMIC 1 #define PETSC_HAVE_CXX_COMPLEX 1 #define PETSC_HAVE_CXX_COMPLEX_FIX 1 #define PETSC_HAVE_CXX_DIALECT_CXX11 1 #define PETSC_HAVE_DLADDR 1 #define PETSC_HAVE_DLCLOSE 1 #define PETSC_HAVE_DLERROR 1 #define PETSC_HAVE_DLFCN_H 1 #define PETSC_HAVE_DLOPEN 1 #define PETSC_HAVE_DLSYM 1 #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #define PETSC_HAVE_DRAND48 1 #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #define PETSC_HAVE_ERF 1 #define PETSC_HAVE_EXECUTABLE_EXPORT 1 #define PETSC_HAVE_F90_2PTR_ARG 1 #define PETSC_HAVE_FCNTL_H 1 #define PETSC_HAVE_FENV_H 1 #define PETSC_HAVE_FE_VALUES 1 #define PETSC_HAVE_FFTW 1 #define PETSC_HAVE_FLOAT_H 1 #define PETSC_HAVE_FORK 1 #define PETSC_HAVE_FORTRAN_FLUSH 1 #define PETSC_HAVE_FORTRAN_FREE_LINE_LENGTH_NONE 1 #define PETSC_HAVE_FORTRAN_TYPE_STAR 1 #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #define PETSC_HAVE_GETCWD 1 #define PETSC_HAVE_GETDOMAINNAME 1 #define PETSC_HAVE_GETHOSTBYNAME 1 #define PETSC_HAVE_GETHOSTNAME 1 #define PETSC_HAVE_GETPAGESIZE 1 #define PETSC_HAVE_GETRUSAGE 1 #define PETSC_HAVE_HDF5 1 #define PETSC_HAVE_INTTYPES_H 1 #define PETSC_HAVE_ISINF 1 #define PETSC_HAVE_ISNAN 1 #define PETSC_HAVE_ISNORMAL 1 #define PETSC_HAVE_LGAMMA 1 #define PETSC_HAVE_LINUX 1 #define PETSC_HAVE_LOG2 1 #define PETSC_HAVE_LSEEK 1 #define PETSC_HAVE_MALLOC_H 1 #define PETSC_HAVE_MEMMOVE 1 #define PETSC_HAVE_MKSTEMP 1 #define PETSC_HAVE_MPIEXEC_ENVIRONMENTAL_VARIABLE OMP #define PETSC_HAVE_MPIIO 1 #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #define PETSC_HAVE_MPI_COMBINER_DUP 1 #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #define PETSC_HAVE_MPI_COUNT 1 #define PETSC_HAVE_MPI_F90MODULE 1 #define PETSC_HAVE_MPI_F90MODULE_VISIBILITY 1 #define PETSC_HAVE_MPI_FEATURE_DYNAMIC_WINDOW 1 #define PETSC_HAVE_MPI_GET_ACCUMULATE 1 #define PETSC_HAVE_MPI_GET_LIBRARY_VERSION 1 #define 
PETSC_HAVE_MPI_INIT_THREAD 1 #define PETSC_HAVE_MPI_INT64_T 1 #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #define PETSC_HAVE_MPI_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_NONBLOCKING_COLLECTIVES 1 #define PETSC_HAVE_MPI_ONE_SIDED 1 #define PETSC_HAVE_MPI_PERSISTENT_NEIGHBORHOOD_COLLECTIVES 1 #define PETSC_HAVE_MPI_PROCESS_SHARED_MEMORY 1 #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #define PETSC_HAVE_MPI_RGET 1 #define PETSC_HAVE_MPI_WIN_CREATE 1 #define PETSC_HAVE_MUMPS 1 #define PETSC_HAVE_NANOSLEEP 1 #define PETSC_HAVE_NETDB_H 1 #define PETSC_HAVE_NETINET_IN_H 1 #define PETSC_HAVE_NO_FINITE_MATH_ONLY 1 #define PETSC_HAVE_OPENCL 1 #define PETSC_HAVE_OPENMPI 1 #define PETSC_HAVE_PACKAGES ":amd:blaslapack:cholmod:fftw3:hdf5:klu:mathlib:mpi:mumps:opencl:pthread:ptscotch:regex:scalapack:spqr:umfpack:x11:yaml:" #define PETSC_HAVE_POPEN 1 #define PETSC_HAVE_POSIX_MEMALIGN 1 #define PETSC_HAVE_PTHREAD 1 #define PETSC_HAVE_PTHREAD_MUTEX 1 #define PETSC_HAVE_PTSCOTCH 1 #define PETSC_HAVE_PWD_H 1 #define PETSC_HAVE_RAND 1 #define PETSC_HAVE_READLINK 1 #define PETSC_HAVE_REALPATH 1 #define PETSC_HAVE_REGEX 1 #define PETSC_HAVE_RTLD_DEFAULT 1 #define PETSC_HAVE_RTLD_GLOBAL 1 #define PETSC_HAVE_RTLD_LAZY 1 #define PETSC_HAVE_RTLD_LOCAL 1 #define PETSC_HAVE_RTLD_NOW 1 #define PETSC_HAVE_SCALAPACK 1 #define PETSC_HAVE_SETJMP_H 1 #define PETSC_HAVE_SHMGET 1 #define PETSC_HAVE_SLEEP 1 #define PETSC_HAVE_SNPRINTF 1 #define PETSC_HAVE_SOCKET 1 #define PETSC_HAVE_SO_REUSEADDR 1 #define PETSC_HAVE_STDATOMIC_H 1 #define PETSC_HAVE_STDINT_H 1 #define PETSC_HAVE_STRCASECMP 1 #define PETSC_HAVE_STRINGS_H 1 #define PETSC_HAVE_STRUCT_SIGACTION 1 #define PETSC_HAVE_SUITESPARSE 1 #define PETSC_HAVE_SYS_PARAM_H 1 #define PETSC_HAVE_SYS_PROCFS_H 1 #define PETSC_HAVE_SYS_RESOURCE_H 1 #define PETSC_HAVE_SYS_SOCKET_H 1 #define PETSC_HAVE_SYS_TIMES_H 1 #define PETSC_HAVE_SYS_TIME_H 1 #define PETSC_HAVE_SYS_TYPES_H 1 #define PETSC_HAVE_SYS_UTSNAME_H 1 #define 
PETSC_HAVE_SYS_WAIT_H 1 #define PETSC_HAVE_TAU_PERFSTUBS 1 #define PETSC_HAVE_TGAMMA 1 #define PETSC_HAVE_TIME 1 #define PETSC_HAVE_TIME_H 1 #define PETSC_HAVE_UNAME 1 #define PETSC_HAVE_UNISTD_H 1 #define PETSC_HAVE_USLEEP 1 #define PETSC_HAVE_VA_COPY 1 #define PETSC_HAVE_VSNPRINTF 1 #define PETSC_HAVE_X 1 #define PETSC_HAVE_YAML 1 #define PETSC_HDF5_HAVE_PARALLEL 1 #define PETSC_HDF5_HAVE_SZLIB 1 #define PETSC_HDF5_HAVE_ZLIB 1 #define PETSC_INTPTR_T intptr_t #define PETSC_INTPTR_T_FMT "#" PRIxPTR #define PETSC_IS_COLORING_MAX USHRT_MAX #define PETSC_IS_COLORING_VALUE_TYPE short #define PETSC_IS_COLORING_VALUE_TYPE_F integer2 #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #define PETSC_LIB_DIR "/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib" #define PETSC_LIB_NAME_SUFFIX "64_complex" #define PETSC_MAX_PATH_LEN 4096 #define PETSC_MEMALIGN 16 #define PETSC_MISSING_LAPACK_lsame 1 #define PETSC_MPICC_SHOW "gcc -I/usr/lib/riscv64-linux-gnu/openmpi/include -I/usr/lib/riscv64-linux-gnu/openmpi/include/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -lmpi" #define PETSC_MPIU_IS_COLORING_VALUE_TYPE MPI_UNSIGNED_SHORT #define PETSC_OMAKE "/usr/bin/make --no-print-directory" #define PETSC_PREFETCH_HINT_NTA 0 #define PETSC_PREFETCH_HINT_T0 3 #define PETSC_PREFETCH_HINT_T1 2 #define PETSC_PREFETCH_HINT_T2 1 #define PETSC_PYTHON_EXE "/usr/bin/python3" #define PETSC_Prefetch(a,b,c) __builtin_prefetch((a),(b),(c)) #define PETSC_REPLACE_DIR_SEPARATOR '\\' #define PETSC_SIGNAL_CAST #define PETSC_SIZEOF_INT 4 #define PETSC_SIZEOF_LONG 8 #define PETSC_SIZEOF_LONG_LONG 8 #define PETSC_SIZEOF_SIZE_T 8 #define PETSC_SIZEOF_VOID_P 8 #define PETSC_SLSUFFIX "so" #define PETSC_UINTPTR_T uintptr_t #define PETSC_UINTPTR_T_FMT "#" PRIxPTR #define PETSC_UNUSED __attribute((unused)) #define PETSC_USE_64BIT_INDICES 1 #define PETSC_USE_AVX512_KERNELS 1 #define PETSC_USE_COMPLEX 1 #define PETSC_USE_CTABLE 1 #define PETSC_USE_DEBUGGER "gdb" #define PETSC_USE_DMLANDAU_2D 1 #define 
PETSC_USE_FORTRAN_BINDINGS 1 #define PETSC_USE_INFO 1 #define PETSC_USE_ISATTY 1 #define PETSC_USE_LOG 1 #define PETSC_USE_MALLOC_COALESCED 1 #define PETSC_USE_PROC_FOR_SIZE 1 #define PETSC_USE_REAL_DOUBLE 1 #define PETSC_USE_SHARED_LIBRARIES 1 #define PETSC_USE_SINGLE_LIBRARY 1 #define PETSC_USE_SOCKET_VIEWER 1 #define PETSC_USE_VISIBILITY_C 1 #define PETSC_USE_VISIBILITY_CXX 1 #define PETSC_USING_64BIT_PTR 1 #define PETSC_USING_F2003 1 #define PETSC_USING_F90FREEFORM 1 #define PETSC__BSD_SOURCE 1 #define PETSC__DEFAULT_SOURCE 1 #define PETSC__GNU_SOURCE 1 ----------------------------------------- Using C/C++ include paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Using C compile: mpicc -o gmakeinfo -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC C compiler version: gcc (Debian 15.2.0-8) 15.2.0 Using C++ compile: mpicxx -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -std=c++11 -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi C++ compiler version: g++ (Debian 15.2.0-8) 15.2.0 Using Fortran include/module paths: -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Using Fortran compile: mpif90 -o gmakeinfo -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi Fortran compiler version: GNU Fortran (Debian 15.2.0-8) 15.2.0 ----------------------------------------- Using C/C++ linker: mpicc Using C/C++ flags: -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC Using Fortran linker: mpif90 Using Fortran flags: -Wl,-z,relro -fPIC -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -fPIC -ffree-line-length-0 ----------------------------------------- Using libraries: -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/lib -L/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/lib -lslepc64_complex -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++ ------------------------------------------ Using mpiexec: /usr/bin/mpiexec --oversubscribe ------------------------------------------ Using MAKE: /usr/bin/make Default MAKEFLAGS: MAKE_NP:4 MAKE_LOAD:4.0 MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo7567 --no-print-directory -- V=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 ========================================== /usr/bin/make --print-directory -f gmakefile -l4.0 --output-sync=recurse V=1 slepc_libs make[5]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1' /usr/bin/python3 /usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/share/petsc/examples/config/gmakegen.py --petsc-arch= --pkg-dir=/build/reproducible-path/slepc-3.24.1+dfsg1 --pkg-name=slepc --pkg-pkgs=sys,eps,svd,pep,nep,mfn,lme --pkg-arch=installed-arch-linux2-c-opt-complex-64 mpicc -c -g -O2 -Werror=implicit-function-declaration 
-ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/impls/tensor/bvtensorf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/impls/tensor/bvtensorf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvbiorthogf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvbiorthogf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvcontourf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvcontourf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvglobalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvglobalf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvfuncf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvfuncf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvopsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvopsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvkrylovf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvkrylovf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/bv/interface/bvorthogf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/bv/interface/bvorthogf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/gsvd/dsgsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/hsvd/dshsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/hsvd/dshsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/pep/dspepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/pep/dspepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/nep/dsnepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/nep/dsnepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/impls/svd/dssvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/impls/svd/dssvdf.o
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-mod/slepcsysmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/ftn-mod/slepcsysmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsprivf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsprivf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/combine/fncombinef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/combine/fncombinef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/phi/fnphif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/phi/fnphif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/ds/interface/dsopsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/ds/interface/dsopsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/impls/rational/fnrationalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/impls/rational/fnrationalf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/ellipse/rgellipsef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/ellipse/rgellipsef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/interval/rgintervalf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/interval/rgintervalf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/polygon/rgpolygonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/polygon/rgpolygonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/fn/interface/fnbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/fn/interface/fnbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/cayley/cayleyf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/cayley/cayleyf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/impls/ring/rgringf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/impls/ring/rgringf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/filter/filterf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/filter/filterf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/shell/shellf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/shell/shellf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/impls/precond/precondf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/impls/precond/precondf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/rg/interface/rgbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/rg/interface/rgbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stslesf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stslesf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stfuncf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stfuncf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stsetf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stsetf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/finitf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/finitf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/mat/matstructf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/mat/matstructf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/classes/st/interface/stsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/classes/st/interface/stsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/mat/matutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/mat/matutilf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcinitf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcinitf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcutilf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/slepcscf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/slepcscf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/vec/veccompf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/vec/veccompf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/sys/vec/vecutilf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/sys/vec/vecutilf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/contiguous/contig.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/contiguous/contig.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/mat/bvmat.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/mat/bvmat.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/svec/svec.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/svec/svec.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/vecs/vecs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/vecs/vecs.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/impls/tensor/bvtensor.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/impls/tensor/bvtensor.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbiorthog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvbiorthog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvblas.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvblas.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvcontour.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvcontour.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvkrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvfunc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvfunc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvglobal.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvglobal.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvops.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvops.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvorthog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvorthog.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvlapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvlapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/interface/bvregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/interface/bvregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/dsutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/dsutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghep/dsghep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghep/dsghep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/hz.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/hz.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/invit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/invit.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gnhep/dsgnhep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/gnhep/dsgnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/ghiep/dsghiep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/ghiep/dsghiep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hsvd/dshsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/hsvd/dshsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/gsvd/dsgsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/gsvd/dsgsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/hep/dshep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/hep/dshep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhepts/dsnhepts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nhepts/dsnhepts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nhep/dsnhep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nhep/dsnhep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/dspep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/pep/dspep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/pep/ftn-custom/zdspepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/nep/dsnep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/nep/dsnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dsops.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dsops.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/combine/fncombine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/combine/fncombine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/impls/svd/dssvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/impls/svd/dssvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/interface/dspriv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/interface/dspriv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/invsqrt/fninvsqrt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/invsqrt/fninvsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/fnutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/fnutil.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/log/fnlog.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/log/fnlog.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/exp/fnexp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/exp/fnexp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/phi/fnphi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/phi/fnphi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/ftn-custom/zrational.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/rational/ftn-custom/zrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/rational/fnrational.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/rational/fnrational.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/interface/fnregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/impls/sqrt/fnsqrt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/impls/sqrt/fnsqrt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ellipse/rgellipse.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/ellipse/rgellipse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/interval/rginterval.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/interval/rginterval.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/polygon/ftn-custom/zpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/interface/fnbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/interface/fnbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/polygon/rgpolygon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/polygon/rgpolygon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/impls/ring/rgring.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/impls/ring/rgring.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/interface/rgregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/interface/rgbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/interface/rgbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/cayley/cayley.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/cayley/cayley.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/chebyshev.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/chebyshev.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/ftn-custom/zshell.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shell/ftn-custom/zshell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filter.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/filter.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/precond/precond.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/precond/precond.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shell/shell.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shell/shell.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/sinvert/sinvert.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/sinvert/sinvert.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/shift/shift.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/shift/shift.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stset.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stset.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stfunc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stfunc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stshellmat.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stshellmat.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/dlregisslepc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/dlregisslepc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/impls/filter/filtlan.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/impls/filter/filtlan.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsles.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stsles.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/finit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/finit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/ftn-custom/zstart.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/ftn-custom/zstart.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matstruct.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/mat/matstruct.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepccontour.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepccontour.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcinit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcinit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/interface/stsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/interface/stsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/slepcsc.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/slepcsc.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/mat/matutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/mat/matutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/pool.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/pool.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/cg/lobpcg/lobpcgf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/cg/lobpcg/lobpcgf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/vecutil.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/vecutil.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/cg/rqcg/rqcgf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/cg/rqcg/rqcgf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/ciss/cissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/ciss/cissf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/davidson/gd/gdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/davidson/gd/gdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/davidson/jd/jdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/davidson/jd/jdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/arnoldi/arnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/arnoldi/arnoldif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/krylovschur/krylovschurf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/krylovschur/krylovschurf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/krylov/lanczos/lanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/krylov/lanczos/lanczosf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/lyapii/lyapiif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/lyapii/lyapiif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/impls/power/powerf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/impls/power/powerf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/vec/veccomp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/vec/veccomp.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/dlregisepsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/dlregisepsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsdefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epssetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epssetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epssolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epssolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/eps/interface/epsviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/eps/interface/epsviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/rqcg/rqcg.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/cg/rqcg/rqcg.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/davidson.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/davidson.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/cg/lobpcg/lobpcg.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/cg/lobpcg/lobpcg.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdgd2.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdgd2.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdcalcpairs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdcalcpairs.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdinitv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdinitv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/ciss/ciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/ciss/ciss.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdtestconv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdtestconv.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdschm.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdschm.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdimprovex.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdimprovex.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdutils.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdutils.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/dvdupdatev.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/dvdupdatev.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/gd/gd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/gd/gd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/davidson/jd/jd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/davidson/jd/jd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/arnoldi/arnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/arnoldi/arnoldi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/external/scalapack/scalapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/external/scalapack/scalapack.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ftn-custom/zkrylovschurf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/epskrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/epskrylov.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-hamilt.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-hamilt.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-indef.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-indef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/krylovschur.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/krylovschur.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-bse.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-bse.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-twosided.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lapack/lapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/lapack/lapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/lanczos/lanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/lanczos/lanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/lyapii/lyapii.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/lyapii/lyapii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/dlregiseps.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/dlregiseps.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/subspace/subspace.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/subspace/subspace.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/krylov/krylovschur/ks-slice.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/krylov/krylovschur/ks-slice.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/impls/power/power.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/impls/power/power.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/ftn-custom/zepsf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/ftn-custom/zepsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epssolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epssetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epssetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/cross/crossf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/cross/crossf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/cyclic/cyclicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/cyclic/cyclicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/lanczos/gklanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/lanczos/gklanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/dlregissvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/dlregissvdf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/impls/trlanczos/trlanczosf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/impls/trlanczos/trlanczosf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/interface/epsview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/interface/epsview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svddefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svddefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/svd/interface/svdviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/svd/interface/svdviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/external/scalapack/svdscalap.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/external/scalapack/svdscalap.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cross/cross.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/cross/cross.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lanczos/gklanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/lanczos/gklanczos.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/randomized/rsvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/randomized/rsvd.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/lapack/svdlapack.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/lapack/svdlapack.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/cyclic/cyclic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/cyclic/cyclic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/dlregissvd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/dlregissvd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/ftn-custom/zsvdf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/ftn-custom/zsvdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svddefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svddefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/interface/svdview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/interface/svdview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/jd/pjdf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/jd/pjdf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/ciss/pcissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/ciss/pcissf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/qarnoldi/qarnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/qarnoldi/qarnoldif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/stoar/qslicef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/stoar/qslicef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/toar/ptoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/toar/ptoarf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/krylov/stoar/stoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/krylov/stoar/stoarf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/dlregispepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/dlregispepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/impls/linear/linearf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/impls/linear/linearf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepbasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepdefaultf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepmonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/impls/trlanczos/trlanczos.c -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/impls/trlanczos/trlanczos.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepsetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepsolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/pep/interface/pepviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/pep/interface/pepviewf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/pepkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/pepkrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/ftn-custom/zstoarf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/qarnoldi/qarnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/qarnoldi/qarnoldi.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/ciss/pciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/ciss/pciss.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/stoar.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/stoar.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/jd/pjd.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/jd/pjd.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/stoar/qslice.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/stoar/qslice.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/qeplin.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/linear/qeplin.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/ptoar.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/toar/ptoar.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/dlregispep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/dlregispep.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/peputils.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/peputils.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/linear/linear.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/linear/linear.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/ftn-custom/zpepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/ftn-custom/zpepf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepdefault.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/impls/krylov/toar/nrefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/impls/krylov/toar/nrefine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/interpol/interpolf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/interpol/interpolf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/ciss/ncissf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/ciss/ncissf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/narnoldi/narnoldif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/narnoldi/narnoldif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/nleigs/nleigs-fullbf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/nleigs/nleigs-fullbf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/nleigs/nleigsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/nleigs/nleigsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/rii/riif.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/rii/riif.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/peprefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/peprefine.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/interface/pepview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/interface/pepview.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=.
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/impls/slp/slpf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/impls/slp/slpf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/dlregisnepf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/dlregisnepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepdefaultf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepdefaultf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepresolvf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepresolvf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/nep/interface/nepviewf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/nep/interface/nepviewf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/interpol/interpol.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/interpol/interpol.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/ftn-custom/znleigsf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/ftn-custom/znleigsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/narnoldi/narnoldi.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/narnoldi/narnoldi.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/ciss/nciss.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/ciss/nciss.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs-fullb.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/nleigs-fullb.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nepdefl.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nepdefl.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/rii/rii.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/rii/rii.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp-twosided.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/slp/slp-twosided.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/dlregisnep.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/dlregisnep.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/ftn-custom/znepf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/ftn-custom/znepf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/slp/slp.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/slp/slp.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepdefault.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepdefault.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepmon.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepbasic.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/impls/nleigs/nleigs.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/impls/nleigs/nleigs.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepregis.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepopts.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepresolv.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepresolv.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepsetup.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/dlregismfnf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/dlregismfnf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepsolve.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnbasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnbasicf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnmonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnmonf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/neprefine.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/neprefine.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnsetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnsetupf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/interface/nepview.c -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/interface/nepview.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnsolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnsolvef.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/mfn/interface/mfnoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/mfn/interface/mfnoptsf.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/dlregismfn.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/dlregismfn.o mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. 
-fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/expokit/mfnexpokit.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/impls/expokit/mfnexpokit.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/impls/krylov/mfnkrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/impls/krylov/mfnkrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/ftn-custom/zmfnf.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/ftn-custom/zmfnf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnmon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnmon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnbasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnbasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/dlregislmef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/dlregislmef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnsetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/interface/mfnsolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/interface/mfnsolve.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmedensef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmedensef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmebasicf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmebasicf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmemonf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmemonf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmeoptsf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmeoptsf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmesetupf.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmesetupf.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/ftn/lme/interface/lmesolvef.c -o installed-arch-linux2-c-opt-complex-64/obj/ftn/lme/interface/lmesolvef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/dlregislme.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/dlregislme.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/ftn-custom/zlmef.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/ftn-custom/zlmef.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/impls/krylov/lmekrylov.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/impls/krylov/lmekrylov.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmebasic.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmebasic.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeregis.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmeregis.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmemon.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmemon.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmeopts.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmeopts.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmedense.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmedense.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesetup.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmesetup.o
mpicc -c -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -Wdate-time -D_FORTIFY_SOURCE=2 -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/interface/lmesolve.c -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/interface/lmesolve.o
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/bv/ftn-mod/slepcbvmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/bv/ftn-mod/slepcbvmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/rg/ftn-mod/slepcrgmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/rg/ftn-mod/slepcrgmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/fn/ftn-mod/slepcfnmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/fn/ftn-mod/slepcfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/lme/ftn-mod/slepclmemod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/lme/ftn-mod/slepclmemod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/ds/ftn-mod/slepcdsmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/ds/ftn-mod/slepcdsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/mfn/ftn-mod/slepcmfnmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/mfn/ftn-mod/slepcmfnmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/sys/classes/st/ftn-mod/slepcstmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/sys/classes/st/ftn-mod/slepcstmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/eps/ftn-mod/slepcepsmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/eps/ftn-mod/slepcepsmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/svd/ftn-mod/slepcsvdmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/svd/ftn-mod/slepcsvdmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/pep/ftn-mod/slepcpepmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/pep/ftn-mod/slepcpepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpif90 -c -g -O2 -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -fPIC -ffree-line-length-0 -I/build/reproducible-path/slepc-3.24.1+dfsg1/include -I/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include -I/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/include -I/usr/include/suitesparse -I/usr/include/scotch_64i -I/usr/include/hdf5/openmpi -MMD -MP /build/reproducible-path/slepc-3.24.1+dfsg1/src/nep/ftn-mod/slepcnepmod.F90 -o installed-arch-linux2-c-opt-complex-64/obj/src/nep/ftn-mod/slepcnepmod.o -J/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/include
mpicc -Wl,-z,relro -fPIC -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-z,relro -fPIC -shared -g -O2 -Werror=implicit-function-declaration -ffile-prefix-map=/build/reproducible-path/petsc-3.24.1+dfsg1=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -Wl,-soname,libslepc64_complex.so.3.24 -o installed-arch-linux2-c-opt-complex-64/lib/libslepc64_complex.so.3.24.1 @installed-arch-linux2-c-opt-complex-64/lib/libslepc64_complex.so.3.24.1.args -L/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex/lib -L/usr/lib/riscv64-linux-gnu/hdf5/openmpi -L/usr/lib/riscv64-linux-gnu/openmpi/lib -L/usr/lib/gcc/riscv64-linux-gnu/15 -L/lib/riscv64-linux-gnu -L/usr/lib/riscv64-linux-gnu -lpetsc64_complex -lspqr -lumfpack -lamd -lcholmod -lklu -lfftw3 -lfftw3_mpi -ldmumps_64 -lzmumps_64 -lsmumps_64 -lcmumps_64 -lmumps_common_64 -lpord_64 -lscalapack-openmpi -llapack -lblas -lptesmumps_64i -lptscotch_64i -lscotch_64i -lptscotcherr -lhdf5 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lm -lOpenCL -lyaml -lX11 -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgfortran -lm -lgcc_s -lstdc++
make[5]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
make[4]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
=========================================
Now to install the library do:
make SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex install
=========================================
make[2]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
make[1]: Leaving directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
debian/rules override_dh_auto_test
make[1]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
set -e; \
if [ "yes" = "no" ]; then \
	echo Tests have been disabled on riscv64; \
else \
	dh_auto_test -plibslepc-real3.24-dev -- \
		SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \
		PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt \
		LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/lib \
		OMP_NUM_THREADS=1 \
		MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"; \
	dh_auto_test -plibslepc-complex3.24-dev -- \
		SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \
		PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex \
		LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex/lib \
		OMP_NUM_THREADS=1 \
		MPIEXEC="mpiexec --oversubscribe --allow-run-as-root" ; \
	dh_auto_test -plibslepc64-real3.24-dev -- \
		SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \
		PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt-64 \
		LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-64/lib \
		OMP_NUM_THREADS=1 \
		MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"; \
	dh_auto_test -plibslepc64-complex3.24-dev -- \
		SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 \
		PETSC_DIR=/usr/lib/petscdir/petsc64-3.24/riscv64-linux-gnu-complex PETSC_ARCH=installed-arch-linux2-c-opt-complex-64 \
		LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt-complex-64/lib \
		OMP_NUM_THREADS=1 \
		MPIEXEC="mpiexec --oversubscribe --allow-run-as-root" ; \
fi
make -j4 test TESTSUITEFLAGS="-j4 --verbose" VERBOSE=1 SLEPC_DIR=/build/reproducible-path/slepc-3.24.1\+dfsg1 PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real PETSC_ARCH=installed-arch-linux2-c-opt LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1\+dfsg1/installed-arch-linux2-c-opt/lib OMP_NUM_THREADS=1 MPIEXEC="mpiexec --oversubscribe --allow-run-as-root"
make[2]: Entering directory '/build/reproducible-path/slepc-3.24.1+dfsg1'
Using MAKEFLAGS: -j4 --jobserver-auth=fifo:/tmp/GMfifo9680 -- MPIEXEC=mpiexec --oversubscribe --allow-run-as-root OMP_NUM_THREADS=1 LD_LIBRARY_PATH=:/build/reproducible-path/slepc-3.24.1+dfsg1/installed-arch-linux2-c-opt/lib PETSC_ARCH=installed-arch-linux2-c-opt PETSC_DIR=/usr/lib/petscdir/petsc3.24/riscv64-linux-gnu-real SLEPC_DIR=/build/reproducible-path/slepc-3.24.1+dfsg1 VERBOSE=1 TESTSUITEFLAGS=-j4 --verbose
Use "/usr/bin/make V=1" to see verbose compile lines, "/usr/bin/make V=0" to suppress.
          RM test-rm-sys.cu
          RM test-rm-sys.cxx
          RM test-rm-sys.F
          FC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1f.o
          FC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test14f.o
          FC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1f.o
          FC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7f.o
          FC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1f.o
          RM test-rm-sys.kokkos.cxx
          RM test-rm-sys.hip.cpp
          RM test-rm-sys.sycl.cxx
          RM test-rm-sys.raja.cxx
          RM test-rm-eps.cu
          RM test-rm-eps.cxx
          RM test-rm-eps.F
          FC installed-arch-linux2-c-opt/tests/eps/tests/test14f.o
          FC installed-arch-linux2-c-opt/tests/eps/tests/test15f.o
          FC installed-arch-linux2-c-opt/tests/eps/tests/test17f.o
          FC installed-arch-linux2-c-opt/tests/eps/tests/test7f.o
          FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex10f.o
          FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex1f.o
          FC installed-arch-linux2-c-opt/tests/eps/tutorials/ex6f.o
          RM test-rm-eps.kokkos.cxx
          RM test-rm-eps.hip.cpp
          RM test-rm-eps.sycl.cxx
          RM test-rm-eps.raja.cxx
          RM test-rm-svd.cu
          RM test-rm-svd.cxx
          RM test-rm-svd.F
          FC installed-arch-linux2-c-opt/tests/svd/tests/test4f.o
          FC installed-arch-linux2-c-opt/tests/svd/tutorials/ex15f.o
          RM test-rm-svd.kokkos.cxx
          RM test-rm-svd.hip.cpp
          RM test-rm-svd.sycl.cxx
          RM test-rm-svd.raja.cxx
          RM test-rm-pep.cu
          RM test-rm-pep.cxx
          RM test-rm-pep.F
          FC installed-arch-linux2-c-opt/tests/pep/tests/test3f.o
          FC installed-arch-linux2-c-opt/tests/pep/tutorials/ex16f.o
          RM test-rm-pep.kokkos.cxx
          RM test-rm-pep.hip.cpp
          RM test-rm-pep.sycl.cxx
          RM test-rm-pep.raja.cxx
          RM test-rm-nep.cu
          RM test-rm-nep.cxx
          RM test-rm-nep.F
          FC installed-arch-linux2-c-opt/tests/nep/tests/test2f.o
          FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex20f.o
          FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex22f.o
          FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex27f.o
          FC installed-arch-linux2-c-opt/tests/nep/tutorials/ex54f.o
          RM test-rm-nep.kokkos.cxx
          RM test-rm-nep.hip.cpp
          RM test-rm-nep.sycl.cxx
          RM test-rm-nep.raja.cxx
          RM test-rm-mfn.cu
          RM test-rm-mfn.cxx
          RM test-rm-mfn.F
          FC installed-arch-linux2-c-opt/tests/mfn/tests/test3f.o
          FC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23f.o
          RM test-rm-mfn.kokkos.cxx
          RM test-rm-mfn.hip.cpp
          RM test-rm-mfn.sycl.cxx
          RM test-rm-mfn.raja.cxx
          RM test-rm-lme.cu
          RM test-rm-lme.cxx
          RM test-rm-lme.F
          RM test-rm-lme.F90
          RM test-rm-lme.kokkos.cxx
          RM test-rm-lme.hip.cpp
          RM test-rm-lme.sycl.cxx
          RM test-rm-lme.raja.cxx
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test10.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test11.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test12.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test13.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test14.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test15.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test16.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test17.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test18.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test19.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test2.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test4.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test5.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test6.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test7.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test8.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test9.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test12.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test13.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test15.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test16.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test17.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test18.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test19.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test2.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test20.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test21.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test22.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test23.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test24.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test25.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test26.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test27.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test4.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test5.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test6.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test7.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test8.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test9.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test10.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test11.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test12.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test13.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test2.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test4.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test5.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test6.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test8.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test9.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test2.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test2.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test4.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test5.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test6.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test7.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test8.o
          CC installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test9.o
          CC installed-arch-linux2-c-opt/tests/sys/mat/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/tests/test1.o
          CC installed-arch-linux2-c-opt/tests/sys/tests/test3.o
          CC installed-arch-linux2-c-opt/tests/sys/tests/test4.o
          CC installed-arch-linux2-c-opt/tests/sys/tutorials/ex33.o
          CC installed-arch-linux2-c-opt/tests/sys/vec/tests/test1.o
     FLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1f
     FLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test14f
     FLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1f
     FLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7f
     FLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1f
          CC installed-arch-linux2-c-opt/tests/eps/tests/test1.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test10.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test11.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test12.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test13.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test14.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test16.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test17.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test18.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test19.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test2.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test20.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test21.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test22.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test23.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test24.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test25.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test26.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test27.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test28.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test29.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test3.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test30.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test31.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test32.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test37.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test38.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test39.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test4.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test40.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test44.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test5.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test6.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test8.o
CC installed-arch-linux2-c-opt/tests/eps/tests/test9.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex10.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex11.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex12.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex13.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex18.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex19.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex2.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex24.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex25.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex29.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex3.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex30.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex31.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex34.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex35.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex36.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex4.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex41.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex43.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex44.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex46.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex47.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex49.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex5.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex55.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex56.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex57.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex7.o
CC installed-arch-linux2-c-opt/tests/eps/tutorials/ex9.o
FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test14f
FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test15f
FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test17f
FLINKER installed-arch-linux2-c-opt/tests/eps/tests/test7f
FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex10f
FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex1f
FLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex6f
CC installed-arch-linux2-c-opt/tests/svd/tests/test1.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test10.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test11.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test12.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test14.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test15.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test16.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test18.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test19.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test2.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test20.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test3.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test4.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test5.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test6.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test7.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test8.o
CC installed-arch-linux2-c-opt/tests/svd/tests/test9.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex14.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex15.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex45.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex48.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex51.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex52.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex53.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/ex8.o
CC installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/network.o
FLINKER installed-arch-linux2-c-opt/tests/svd/tests/test4f
FLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex15f
CC installed-arch-linux2-c-opt/tests/pep/tests/test1.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test10.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test11.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test12.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test2.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test3.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test4.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test5.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test6.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test7.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test8.o
CC installed-arch-linux2-c-opt/tests/pep/tests/test9.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex16.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex17.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex28.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex38.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex40.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/ex50.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_1d.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_2d.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/butterfly.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/damped_beam.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/loaded_string.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/planar_waveguide.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/sleeper.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/spring.o
CC installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/wiresaw.o
FLINKER installed-arch-linux2-c-opt/tests/pep/tests/test3f
FLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex16f
CC installed-arch-linux2-c-opt/tests/nep/tests/test1.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test10.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test12.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test13.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test14.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test15.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test16.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test17.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test2.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test3.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test4.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test5.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test6.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test7.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test8.o
CC installed-arch-linux2-c-opt/tests/nep/tests/test9.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex20.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex21.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex22.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex27.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/ex42.o
CC installed-arch-linux2-c-opt/tests/nep/tutorials/nlevp/loaded_string.o
FLINKER installed-arch-linux2-c-opt/tests/nep/tests/test2f
FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex20f
FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex22f
FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex27f
FLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex54f
CC installed-arch-linux2-c-opt/tests/mfn/tests/test1.o
CC installed-arch-linux2-c-opt/tests/mfn/tests/test2.o
CC installed-arch-linux2-c-opt/tests/mfn/tests/test3.o
CC installed-arch-linux2-c-opt/tests/mfn/tests/test4.o
CC installed-arch-linux2-c-opt/tests/mfn/tests/test5.o
CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23.o
CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex26.o
CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex37.o
CC installed-arch-linux2-c-opt/tests/mfn/tutorials/ex39.o
FLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test3f
FLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23f
CC installed-arch-linux2-c-opt/tests/lme/tests/test1.o
CC installed-arch-linux2-c-opt/tests/lme/tests/test2.o
CC installed-arch-linux2-c-opt/tests/lme/tutorials/ex32.o
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test10
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test11
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test13
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test14
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test15
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test16
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test17
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test18
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test19
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/bv/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test13
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test15
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test16
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test17
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test18
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test19
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test20
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test21
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test22
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test23
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test24
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test25
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test26
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test27
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/ds/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test10
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test11
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test13
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/fn/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/rg/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/sys/classes/st/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/sys/mat/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/sys/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/sys/tutorials/ex33
CLINKER installed-arch-linux2-c-opt/tests/sys/vec/tests/test1
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-vecs.counts
not ok sys_classes_bv_tests-test1f_1_bv_type-vecs # Error code: 14
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11260] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbcbc7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11263] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11263] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test1f_1_bv_type-vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-contiguous.counts
not ok sys_classes_bv_tests-test1f_1_bv_type-contiguous # Error code: 14
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11291] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fac363000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11294] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11294] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1f_1_bv_type-contiguous # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-svec.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_1_bv_type-mat.counts
not ok sys_classes_bv_tests-test1f_1_bv_type-svec # Error code: 14
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11321] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb9599000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test1f_1_bv_type-svec # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-vecs.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-contiguous.counts
not ok sys_classes_bv_tests-test1f_1_bv_type-mat # Error code: 14
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11344] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f84fda000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11355] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11355] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1f_1_bv_type-mat # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-svec.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1f_2_bv_type-mat.counts
not ok sys_classes_bv_tests-test1f_2_bv_type-vecs # Error code: 14
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f98884000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11366] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11362] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:11365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-11362@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test1f_2_bv_type-vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test14f_1.counts
# retrying sys_classes_bv_tests-test1f_2_bv_type-contiguous
not ok sys_classes_bv_tests-test1f_2_bv_type-svec # Error code: 14
not ok sys_classes_bv_tests-test1f_2_bv_type-mat # Error code: 14
not ok sys_classes_ds_tests-test14f_1 # Error code: 14
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fab929000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11420] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:11428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:11429] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11428] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11429] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-11420@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9c435000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11414] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:11424] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-11414@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test1f_2_bv_type-mat # SKIP Command failed so no diff
ok sys_classes_bv_tests-test1f_2_bv_type-svec # SKIP Command failed so no diff
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11465] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9ea47000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11468] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11468] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test14f_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test1f_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7f_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1f_1.counts
not ok sys_classes_rg_tests-test1f_1 # Error code: 14
not ok sys_classes_fn_tests-test7f_1+fn_method-0 # Error code: 14
not ok sys_classes_fn_tests-test1f_1 # Error code: 14
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11543] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f99cf5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11553] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11553] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test1f_1 # SKIP Command failed so no diff
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11545] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fafd38000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11554] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11554] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11544] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb9630000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11552] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11552] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test1f_1 # SKIP Command failed so no diff
ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test1
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test10
not ok sys_classes_fn_tests-test7f_1+fn_method-1 # Error code: 14
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11597] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb821c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11617] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11617] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test7f_1+fn_method-2 # Error code: 14
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11631] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f85c49000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11634] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11634] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test7f_1+fn_method-3 # Error code: 14
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fb8666000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11651] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11651] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test7f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test11
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test13
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test14
not ok sys_classes_bv_tests-test1f_2_bv_type-contiguous # Error code: 14
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11687] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb8129000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11690] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11690] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:11687] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
ok sys_classes_bv_tests-test1f_2_bv_type-contiguous # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test16
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test17
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test18
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test19
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test20
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test21
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test22
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test23
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test24
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test25
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test26
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test27
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test28
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test29
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test30
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test31
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test32
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test37
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test38
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test39
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test40
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test44
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/eps/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex10
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex11
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex12
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex13
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex18
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex19
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex2
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex24
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex25
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex29
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex3
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex30
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex31
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex34
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex35
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex36
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex4
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex41
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex43
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex44
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex46
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex47
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex49
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex5
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex55
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex56
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex57
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex7
CLINKER installed-arch-linux2-c-opt/tests/eps/tutorials/ex9
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test14f_1.counts
not ok eps_tests-test14f_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test15f_1.counts
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11952] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f971ae000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11955] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11955] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test14f_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17f_1.counts
not ok eps_tests-test15f_1 # Error code: 14
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:11983] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb9db3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:11993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:11993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test15f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test7f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10f_1_sinvert.counts not ok eps_tests-test7f_1 # Error code: 14 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12028] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8c30a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test7f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10f_1_shell.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex1f_1.counts not ok eps_tutorials-ex10f_1_sinvert # Error code: 14 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12058] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, 
but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9287f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12069] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12069] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
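The PMIx error text above repeatedly suggests a workaround: disabling the gds/shmem2 component by setting PMIX_MCA_gds=hash in the environment. A minimal sketch of applying that before re-running a failing test; the binary name in the comment is a placeholder, not taken from this log, and this is only the suggested mitigation, not a confirmed fix for the riscv64 address mismatch:

```shell
# Disable the PMIx gds/shmem2 component, as the error message advises.
# Exporting the variable makes it visible to any MPI/PMIx process started
# from this shell (e.g. a re-run of a hypothetical ./test15f binary).
export PMIX_MCA_gds=hash
echo "PMIX_MCA_gds is set to: $PMIX_MCA_gds"
```

The variable only affects processes launched after the export, so it must be set before the test harness (or mpirun) starts.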
ok eps_tutorials-ex10f_1_sinvert # SKIP Command failed so no diff
not ok eps_tutorials-ex10f_1_shell+eps_two_sided-0 # Error code: 14
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa8640000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12072] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12072] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex10f_1_shell # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex6f_1.counts
not ok eps_tutorials-ex1f_1 # Error code: 14
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12088] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9aa8d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12118] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12118] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex1f_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex10f_1_shell+eps_two_sided-1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex6f_1_ts.counts
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12116] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8c335000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12127] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12127] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex10f_1_shell # SKIP Command failed so no diff
not ok eps_tutorials-ex6f_1 # Error code: 14
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test1
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12124] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fafc8b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex6f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test10
not ok eps_tutorials-ex6f_1_ts # Error code: 14
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3faf5fb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12206] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12206] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex6f_1_ts # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test11
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test14
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test15
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test16
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test18
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test19
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test20
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/svd/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex14
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex15
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex45
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex48
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex51
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex52
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex53
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/ex8
CC installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/embedgsvd.o
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4f_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15f_1.counts
not ok svd_tests-test4f_1+svd_type-lanczos # Error code: 14
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12333] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa2725000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12336] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12336] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test4f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test1
not ok svd_tutorials-ex15f_1 # Error code: 14
not ok svd_tests-test4f_1+svd_type-trlanczos # Error code: 14
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12358] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb4660000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12356] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9bfc0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12363] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12363] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test4f_1 # SKIP Command failed so no diff
ok svd_tutorials-ex15f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test10
not ok svd_tests-test4f_1+svd_type-cross # Error code: 14
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12395] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb115d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12408] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12408] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test4f_1 # SKIP Command failed so no diff
not ok svd_tests-test4f_1+svd_type-cyclic # Error code: 14
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12422] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f91f3d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test4f_1 # SKIP Command failed so no diff
not ok svd_tests-test4f_1+svd_type-randomized # Error code: 14
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12439] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f83ce3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
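The harness output in this log is TAP-style: each failing run appears as a "not ok" line with "# Error code: 14", and is then re-reported as "ok … # SKIP Command failed so no diff" because no output diff could be taken. A small sketch for tallying failures from a saved log; the three sample lines are copied verbatim from this section and stand in for the full file:

```shell
# Count TAP "not ok" results in a log excerpt. The sample below uses
# three lines taken from the output above, not the whole build log.
log='not ok svd_tests-test4f_1+svd_type-randomized # Error code: 14
ok svd_tests-test4f_1 # SKIP Command failed so no diff
not ok pep_tests-test3f_1 # Error code: 14'
# grep -c counts the lines matching the anchored "not ok" pattern.
failures=$(printf '%s\n' "$log" | grep -c '^not ok')
echo "failures: $failures"   # prints "failures: 2"
```

Anchoring the pattern with `^` matters: without it, the "ok … # SKIP" lines would also be scanned for an embedded "not ok" substring, and unanchored matching is fragile against indented diagnostic lines.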
ok svd_tests-test4f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test11
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test12
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test2
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test3
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test4
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test5
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test6
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test7
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test8
CLINKER installed-arch-linux2-c-opt/tests/pep/tests/test9
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex16
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex17
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex28
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex38
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex40
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/ex50
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_1d
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/acoustic_wave_2d
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/butterfly
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/damped_beam
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/loaded_string
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/planar_waveguide
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/sleeper
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/spring
CLINKER installed-arch-linux2-c-opt/tests/pep/tutorials/nlevp/wiresaw
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test3f_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16f_1.counts
not ok pep_tests-test3f_1 # Error code: 14
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f82f85000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:12578] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:12578] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test3f_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test1
not ok pep_tutorials-ex16f_1 # Error code: 14
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:12580] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa3caf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials-ex16f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test10 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test12 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test13 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test14 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test15 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test16 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test17 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test6 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test7 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test8 CLINKER installed-arch-linux2-c-opt/tests/nep/tests/test9 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex20 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex21 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex22 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/ex27 CLINKER 
installed-arch-linux2-c-opt/tests/nep/tutorials/ex42 CLINKER installed-arch-linux2-c-opt/tests/nep/tutorials/nlevp/loaded_string TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test2f_1.counts not ok nep_tests-test2f_1 # Error code: 14 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3faef79000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12720] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12720] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tests-test2f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22f_1.counts not ok nep_tutorials-ex20f_1 # Error code: 14 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12747] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9f5ec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12750] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12750] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok nep_tutorials-ex20f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27f_1.counts not ok nep_tutorials-ex22f_1 # Error code: 14 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12759] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9f3b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12769] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12769] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex22f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27f_2.counts not ok nep_tutorials-ex27f_1 # Error code: 14 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12788] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f880d3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12791] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12791] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27f_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex54f_1_slp.counts not ok nep_tutorials-ex27f_2 # Error code: 14 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12820] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9690c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12833] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12833] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27f_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex54f_1_nleigs.counts not ok nep_tutorials-ex54f_1_slp # Error code: 14 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12848] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f88b9b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex54f_1_slp # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test1 not ok nep_tutorials-ex54f_1_nleigs # Error code: 14 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12880] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8c856000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12889] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12889] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex54f_1_nleigs # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test2 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test3 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test4 CLINKER installed-arch-linux2-c-opt/tests/mfn/tests/test5 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex23 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex26 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex37 CLINKER installed-arch-linux2-c-opt/tests/mfn/tutorials/ex39 TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3f_1.counts TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex23f_1.counts not ok mfn_tests-test3f_1 # Error code: 14 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12963] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # 
-------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa980a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:12966] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:12966] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok mfn_tests-test3f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/lme/tests/test1 CLINKER installed-arch-linux2-c-opt/tests/lme/tests/test2 not ok mfn_tutorials-ex23f_1 # Error code: 14 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:12986] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9293c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13000] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13000] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok mfn_tutorials-ex23f_1 # SKIP Command failed so no diff CLINKER installed-arch-linux2-c-opt/tests/lme/tutorials/ex32 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-vecs.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-contiguous.counts not ok sys_classes_bv_tests-test1_1_bv_type-vecs # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-svec.counts # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13035] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbb312000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_1_bv_type-vecs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_1_bv_type-mat.counts not ok sys_classes_bv_tests-test1_1_bv_type-contiguous # Error code: 14 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb8387000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13078] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13078] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test1_1_bv_type-contiguous # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-vecs.counts not ok sys_classes_bv_tests-test1_1_bv_type-svec # Error code: 14 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13077] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbd0b8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13087] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13087] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1_1_bv_type-svec # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test1_1_bv_type-mat # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-contiguous.counts
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13084] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbb4bb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13090] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13090] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1_1_bv_type-mat # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-svec.counts
not ok sys_classes_bv_tests-test1_2_bv_type-vecs # Error code: 14
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13144] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f96c8c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13169] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13169] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test1_2_bv_type-vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test1_2_bv_type-mat.counts
not ok sys_classes_bv_tests-test1_2_bv_type-contiguous # Error code: 14
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13167] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa045f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13177] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13177] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1_2_bv_type-contiguous # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test1_2_bv_type-svec # Error code: 14
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fab0b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13180] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13180] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1_2_bv_type-svec # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test10_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_1.counts
not ok sys_classes_bv_tests-test1_2_bv_type-mat # Error code: 14
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13236] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb402a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13260] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13260] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test1_2_bv_type-mat # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_4.counts
# retrying sys_classes_bv_tests-test10_1+bv_type-vecs
not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-gs # Error code: 14
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f924b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13311] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13311] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13307@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-chol # Error code: 14
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa6805000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13330] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13333] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13334] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13334] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13333] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13330@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-vecs_bv_orthog_block-svqb # Error code: 14
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f83364000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13350] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13354] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13353] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13353] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13354] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13350@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-gs # Error code: 14
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3faa8c8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13370] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13374] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13373] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13374] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13373] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13370@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-chol # Error code: 14
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f90f05000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13393] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13390] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13394] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13393] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13394] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13390@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbe4a5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13410] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13413] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13413] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13410@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-gs # Error code: 14
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8aa06000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13430] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13433] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13434] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13434] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13433] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13430@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-chol # Error code: 14
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8f08d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13450] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13450@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-svec_bv_orthog_block-svqb # Error code: 14
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fb54af000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13473] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13470@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-gs # Error code: 14
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8702c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13490] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13493] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13493] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13490@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-chol # Error code: 14
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f880a6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13510] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13514] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13513] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13514] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13513] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13510@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test10_1+bv_type-vecs # Error code: 14
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb68a1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13539] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13542] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13543] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13543] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13542] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13539@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_4+bv_type-mat_bv_orthog_block-svqb # Error code: 14
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f99c7a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13533] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13533] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13530@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_6.counts
not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-gs # Error code: 14
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9fdd6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13588] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13593] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13594] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13593] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13594] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13588@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test10_1+bv_type-contiguous # Error code: 14
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa1e4b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13562] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13563] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13562] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13563] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13559@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-chol # Error code: 14 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbc1a0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13612] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13615] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13615] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13612@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test10_1+bv_type-svec # Error code: 14 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13632] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base 
address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbc9a6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13648] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13647] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13648] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13647] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-13632@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-tsqr # Error code: 14 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f91d14000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13652] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13646] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13651] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13652] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13651] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13646@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-tsqrchol # Error code: 14 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3faa903000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13692] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13696] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13695] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13695] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:13696] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13692@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-vecs_bv_orthog_block-svqb # Error code: 14 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a 
shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9ab7f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13712] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13715] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13716] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13716] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:13715] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13712@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test10_1+bv_type-mat # Error code: 14 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # 
particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb8f21000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13672] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13676] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13675] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13676] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:13675] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13672@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_9.counts not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-gs # Error code: 14 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # 
-------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb23a5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:13759] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13734] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:13758] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:13758] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-13734@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_9+bv_type-vecs_bv_orthog_block-gs # Error code: 14 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa5e22000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13765] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13761] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13764] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13765] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13764] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13761@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-chol # Error code: 14
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa9729000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13801] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13801] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13785@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-vecs_bv_orthog_block-chol # Error code: 14
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8448f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13805] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13805] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13797@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-tsqr # Error code: 14
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13827] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f994e7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13842] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13844] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13844] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13842] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13827@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-vecs_bv_orthog_block-svqb # Error code: 14
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb9518000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13837] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13845] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13843] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13845] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13843] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13837@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-contiguous_bv_orthog_block-gs # Error code: 14
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9585c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13885] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13888] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13889] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13889] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13888] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13885@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-tsqrchol # Error code: 14
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9348c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13865] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13869] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:13869] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13865@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-contiguous_bv_orthog_block-chol # Error code: 14
not ok sys_classes_bv_tests-test11_6+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbc966000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13922] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13917@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:13917] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171
#
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9adfb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13907] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13925] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13907@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:13907] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-contiguous_bv_orthog_block-svqb # Error code: 14
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb5a5d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13956] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13962] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13964] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13962] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13964] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13956@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-svec_bv_orthog_block-gs # Error code: 14
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13985] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbef82000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13989] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13988] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13988] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13989] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13985@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-gs # Error code: 14
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9aeab000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:13957] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:13963] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:13965] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13965] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:13963] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-13957@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-svec_bv_orthog_block-chol # Error code: 14
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14005] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f94f24000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14021] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14005@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:14005] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-chol # Error code: 14
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f94d90000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14019] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14024] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14024] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14019@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-svec_bv_orthog_block-svqb # Error code: 14
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14045] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f951b8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14045@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-mat_bv_orthog_block-gs # Error code: 14
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f94831000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14085] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14088] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14089] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14089] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:14088] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14085@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-tsqr # Error code: 14
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb0f48000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14057] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14065] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14065] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14057@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_9+bv_type-mat_bv_orthog_block-chol # Error code: 14
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb807f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14105] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14113] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14113] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14105@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-tsqrchol # Error code: 14
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a
shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fae4ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14125] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14125] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14121@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-svec_bv_orthog_block-svqb # Error code: 14 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # 
-------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbcc92000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14165] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14158] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14164] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** and MPI will try to terminate your MPI job as well) # [sbuild:14165] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14164] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14158@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-gs # Error code: 14 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # 
-------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f7fa4b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14189] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # 
-------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:14188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14189] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14185@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_9+bv_type-mat_bv_orthog_block-svqb # Error code: 14 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14145] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9f7a9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14161] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14160] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14160] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14161] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14145@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_11.counts not ok sys_classes_bv_tests-test11_11+bv_type-vecs # Error code: 14 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa1718000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14237] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14238] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:14241] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14237] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:14240] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14243] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate 
your MPI job as well) # [sbuild:14238] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:14239] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14241] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14234@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-chol # Error code: 14 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # 
-------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3face48000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14230] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14207] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14231] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14230] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14207@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-tsqr # Error code: 14 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f88f02000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:14311] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14311] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14307@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_11+bv_type-contiguous # Error code: 14
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14272] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8aa06000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14276] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14276] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:14272] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14272] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14272] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_11+bv_type-svec # Error code: 14
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3faabb8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14350] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14346] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14349] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14343] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14347] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14350] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:14351] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14347] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14346] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14343@1,4]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-tsqrchol # Error code: 14
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9df90000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14335] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14327] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14336] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14335] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14327@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_11+bv_type-mat # Error code: 14
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8dcb3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14379] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14386] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14386] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14379@1,4]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_11 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test11_12.counts
not ok sys_classes_bv_tests-test11_6+bv_type-mat_bv_orthog_block-svqb # Error code: 14
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb7af5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14427] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14427] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14407@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test11_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test12_1.counts
not ok sys_classes_bv_tests-test11_12+bv_type-vecs # Error code: 14
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14436] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8f23f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14443] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14443] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:14436] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14436] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14436] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_12+bv_type-contiguous # Error code: 14
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f87518000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14504] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14507] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14505] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14503] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14509] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14506] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14507] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14504] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14503] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14505] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14494@1,4]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test12_1+bv_type-vecs # Error code: 14
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14500] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb23cc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14512] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14512] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_12+bv_type-svec # Error code: 14
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb2a86000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14556] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14558] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14556] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14559] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14542] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14562] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14558] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14542@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test12_1+bv_type-contiguous # Error code: 14
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14552] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f94d23000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14563] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14563] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test12_1+bv_type-svec # Error code: 14
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14601] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8fa23000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14606] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14606] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test12_1+bv_type-mat # Error code: 14
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14641] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb1fa3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14644] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14644] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test12_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test11_12+bv_type-mat # Error code: 14
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14593] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbe674000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14608] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14608] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:14593] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14593] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:14593] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
#
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test13_1.counts
ok sys_classes_bv_tests-test11_12 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test14_1.counts
not ok sys_classes_bv_tests-test13_1+bv_type-vecs # Error code: 14
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14690] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f82aa7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14699] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14699] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test14_1+bv_type-vecs # Error code: 14
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14696] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f932c2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14702] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14702] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14696@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test13_1+bv_type-contiguous # Error code: 14
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14725] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb6065000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14736] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14736] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
not ok sys_classes_bv_tests-test14_1+bv_type-contiguous # Error code: 14
ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14733] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f976c4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14739] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14739] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14733@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test13_1+bv_type-svec # Error code: 14
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14768] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa06dd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test13_1+bv_type-mat # Error code: 14 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14795] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fba7de000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test13_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test14_2.counts
not ok sys_classes_bv_tests-test14_2
# Error code: 14
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14825] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa6160000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14828] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14828] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test14_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test15_1.counts
not ok sys_classes_bv_tests-test14_1+bv_type-svec
# Error code: 14
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9eefe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14770] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14776] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14777] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14776] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14777] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated.
The first process to do so was:
#
# Process name: [prterun-sbuild-14770@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test14_1+bv_type-mat
# Error code: 14
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb5afd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14874] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14875] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14875] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14874] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14871@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test14_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test15_2.counts
not ok sys_classes_bv_tests-test15_1+bv_type-vecs
# Error code: 14
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3facae9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14855] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14868] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14870] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14868] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14870] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14855@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test15_1+bv_type-contiguous
# Error code: 14
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb4d3a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14932] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14931] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14932] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14928@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test15_2+bv_type-vecs
# Error code: 14
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f83d4c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14908] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14911] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14912] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14912] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14911] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14908@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test15_1+bv_type-svec
# Error code: 14
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file
../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8900b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14948] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14952] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14953] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14953] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14952] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14948@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test15_2+bv_type-contiguous
# Error code: 14
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c
at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8118a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:14968] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14964] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:14967] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14968] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:14967] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-14964@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test15_1+bv_type-mat
# Error code: 14
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:14987] PMIX ERROR: PMIX_ERROR in file
../../../src/server/pmix_server.c at line 4721 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:14987] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8bb13000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:14991] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:14992] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14991] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:14992] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-14987@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test16_1.counts not ok sys_classes_bv_tests-test15_2+bv_type-svec # Error code: 14 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb7232000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15012] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15011] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in 
this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15011] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15012] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15008@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test16_1+bv_type-vecs # Error code: 14 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15044] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular 
base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb2197000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15056] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15057] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15057] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15056] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15044@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test15_2+bv_type-mat # Error code: 14 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f814bd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15053] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15053@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test15_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test17_1.counts not ok sys_classes_bv_tests-test17_1+bv_type-vecs # Error code: 14 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa1dbd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15113] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15113] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test16_1+bv_type-contiguous # Error code: 14 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbd585000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15081] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15090] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15093] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15090] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15093] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15081@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_1+bv_type-contiguous # Error code: 14 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15131] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was 
given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faecdc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15134] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15134] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_1+bv_type-svec # Error code: 14 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15160] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fba63c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15167] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15167] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test17_1+bv_type-mat # Error code: 14 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb0baa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test17_2.counts not ok sys_classes_bv_tests-test16_1+bv_type-svec # Error code: 14 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9d83c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15166] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15165] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15165] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15166] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15149@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test17_2+bv_type-vecs # Error code: 14
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15218] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8fc31000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15230] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15230] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test16_1+bv_type-mat # Error code: 14
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f7fc72000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15227] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15233] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15233] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15227@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test16_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test18_1.counts
not ok sys_classes_bv_tests-test17_2+bv_type-contiguous # Error code: 14
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15257] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa9d04000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15274] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15274] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-vecs_nsize-1 # Error code: 14
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15280] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa6160000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15283] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15283] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test17_2+bv_type-svec # Error code: 14
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15298] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa683a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15304] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test17_2+bv_type-mat # Error code: 14
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15335] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8373f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15339] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15339] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test17_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test19_1.counts
not ok sys_classes_bv_tests-test18_1+bv_type-vecs_nsize-2 # Error code: 14
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9fedf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15318] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15318] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15314@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-svec_nsize-1 # Error code: 14
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f96437000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-contiguous_nsize-1 # Error code: 14
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15381] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb7270000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15384] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15384] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-contiguous_nsize-2 # Error code: 14
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f90df2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15420] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15419] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15420] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15419] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15416@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-svec_nsize-2 # Error code: 14
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb7889000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15403] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15398] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15404] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15403] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-15398@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test18_1+bv_type-svec_nsize-1 # Error code: 14 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15440] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9c7e3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15443] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15443] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-mat_nsize-1 # Error code: 14
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15461] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f80746000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-0_bv_type-mat_nsize-2 # Error code: 14
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fba4b5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15498] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15497] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15498] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15497] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15494@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-svec_nsize-2 # Error code: 14
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15469] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa8e74000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15476] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15476] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15469@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-svec_nsize-1 # Error code: 14
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa5a1a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15529] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15529] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-mat_nsize-1 # Error code: 14
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb811e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15532] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15532] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-svec_nsize-2 # Error code: 14
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa2de0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15564] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15548] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15563] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15564] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15563] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15548@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-mat_nsize-1 # Error code: 14
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15588] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa2132000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15591] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15591] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test18_1+bv_type-mat_nsize-2 # Error code: 14
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15560] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f974be000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15568] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:15568] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:15560] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15560@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test18_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_1.counts
not ok sys_classes_bv_tests-test2_1+bv_type-vecs # Error code: 14
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15634] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa310c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15637] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15637] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test19_1+nc-2_bv_type-mat_nsize-2 # Error code: 14
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa8c3d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:15612] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:15614] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15612] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15614] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-15605@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test19_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_1+bv_type-contiguous # Error code: 14
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15655] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa9c99000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15658] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15658] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_2.counts
not ok sys_classes_bv_tests-test2_1+bv_type-svec # Error code: 14
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15690] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8252f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15700] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15700] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_2+bv_type-vecs # Error code: 14
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15697] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f96a75000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_1+bv_type-mat # Error code: 14
not ok sys_classes_bv_tests-test2_2+bv_type-contiguous # Error code: 14
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15731] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f912ff000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15737] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15737] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15719] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f90392000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15735] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15735] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test2_3.counts
not ok sys_classes_bv_tests-test2_2+bv_type-svec # Error code: 14
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15764] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f96a20000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15781] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15781] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_3+bv_type-vecs # Error code: 14
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15778] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9aec4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15784] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15784] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_2+bv_type-mat # Error code: 14
not ok sys_classes_bv_tests-test2_3+bv_type-contiguous # Error code: 14
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15800] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9100f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15815] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15815] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_2 # SKIP Command failed so no diff
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15812] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb1e02000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15818] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15818] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_1.counts
not ok sys_classes_bv_tests-test2_3+bv_type-svec # Error code: 14
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15847] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f99400000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15862] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15862] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test3_1+bv_type-vecs # Error code: 14
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15859] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9f54d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15865] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15865] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test2_3+bv_type-mat # Error code: 14
not ok sys_classes_bv_tests-test3_1+bv_type-contiguous # Error code: 14
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa0bd7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15893] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3faba3b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15898] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15898] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
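The PMIx warnings above all point at the same mitigation. As a hedged sketch (the variable name and value come straight from the log's own hint; applying it to the actual test invocation is left to the reader), disabling the gds/shmem2 component for a test run would look like:

```shell
# Workaround suggested by the PMIx warnings above: select the PMIx
# "hash" gds component instead of gds/shmem2, so clients do not try
# to attach a shared-memory segment at a fixed base address.
export PMIX_MCA_gds=hash
# Show the setting that the failing test runs would then inherit.
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"
```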
ok sys_classes_bv_tests-test2_3 # SKIP Command failed so no diff
ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_1_svec_vecs.counts
not ok sys_classes_bv_tests-test3_1+bv_type-svec # Error code: 14
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15927] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fabbe3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15943] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15943] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test3_1_svec_vecs # Error code: 14
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15940] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8eb1c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:15946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:15946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test3_1_svec_vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_2.counts
not ok sys_classes_bv_tests-test3_1+bv_type-mat # Error code: 14
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:15962] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f92dcd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test3_3.counts not ok sys_classes_bv_tests-test3_2+bv_type-vecs # Error code: 14 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:15990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, 
but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9448f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:15994] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:15993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:15994] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-15990@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-vecs # Error code: 14 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbaa94000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16044] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16042] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16042] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16044] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16035@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-contiguous # Error code: 14 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in 
file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa4cdf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16065] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16069] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16068] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16068] 
Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:16069] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16065@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_2+bv_type-contiguous # Error code: 14 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb41e5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16045] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16045] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16043] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16037@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_2+bv_type-svec # Error code: 14 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa451a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16101] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16104] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16105] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16104] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16105] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16101@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-svec # Error code: 14 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f91877000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16085] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16089] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16088] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16089] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16088] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16085@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test3_3+bv_type-mat # Error code: 14 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa7b55000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16145] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16148] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16149] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16148] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16149] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16145@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test3_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_1.counts not ok sys_classes_bv_tests-test3_2+bv_type-mat # Error code: 14 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9bdc5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16128] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16129] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16129] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error 
messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16128] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-16125@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test3_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_1_vecs_vmip.counts not ok sys_classes_bv_tests-test4_1+bv_type-vecs # Error code: 14 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16178] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f837a5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16198] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16198] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_1_vecs_vmip # Error code: 14 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16206] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8c4ba000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16209] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16209] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test4_1_vecs_vmip # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test4_2.counts not ok sys_classes_bv_tests-test4_1+bv_type-contiguous # Error code: 14 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16223] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb2d35000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16228] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16228] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-vecs # Error code: 14 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16253] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb5bd2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16256] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16256] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_1+bv_type-svec # Error code: 14 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16270] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa720f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16273] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16273] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-contiguous # Error code: 14 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16287] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fabace000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16290] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16290] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_1+bv_type-mat # Error code: 14 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16304] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f828dc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test5_1.counts not ok sys_classes_bv_tests-test4_2+bv_type-svec # Error code: 14 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16321] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9bebb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_1+bv_type-vecs # Error code: 14 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16353] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f944ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16360] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16360] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test4_2+bv_type-mat # Error code: 14 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16368] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb7cdf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16371] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16371] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test4_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test5_2.counts not ok sys_classes_bv_tests-test5_1+bv_type-contiguous # Error code: 14 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16387] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fad2c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16395] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16395] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_2+bv_type-vecs # Error code: 14 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16415] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa8ff8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16418] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16418] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test5_1+bv_type-svec # Error code: 14 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16432] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9f058000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16449] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16449] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test5_2+bv_type-contiguous # Error code: 14
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16448] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8d5b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16452] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16452] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test5_1+bv_type-mat # Error code: 14
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16468] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa1dde000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16483] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16483] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test5_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test5_2+bv_type-svec # Error code: 14
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16480] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb57ee000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16486] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16486] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_1.counts
not ok sys_classes_bv_tests-test5_2+bv_type-mat # Error code: 14
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16519] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9b6b2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16530] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16530] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test5_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_2.counts
not ok sys_classes_bv_tests-test6_1+bv_type-vecs # Error code: 14
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8e679000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16533] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16533] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_1+bv_type-contiguous # Error code: 14
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa9748000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16577] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16577] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_2+bv_type-vecs # Error code: 14
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16574] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9c29c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:16581] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16581] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16580] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-16574@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_1+bv_type-svec # Error code: 14
not ok sys_classes_bv_tests-test6_2+bv_type-contiguous # Error code: 14
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16605] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f87542000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16618] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16618] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fb3b7f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:16617] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16617] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-16611@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_1+bv_type-mat # Error code: 14
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa9a69000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16654] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16654] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test6_3.counts
not ok sys_classes_bv_tests-test6_3+bv_type-vecs # Error code: 14
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16686] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb6112000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16689] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16689] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_2+bv_type-svec # Error code: 14
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbd07d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16648] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:16653] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:16655] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16653] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-16648@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_3+bv_type-contiguous # Error code: 14
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16703] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8eceb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16718] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16718] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_3+bv_type-svec # Error code: 14
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16740] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa8040000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16743] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16743] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test6_2+bv_type-mat # Error code: 14
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f88530000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:16721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:16722] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16722] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-16717@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test6_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_1.counts
not ok sys_classes_bv_tests-test6_3+bv_type-mat # Error code: 14
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16757] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbc07a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16760] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16760] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test6_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_1_mat.counts
not ok sys_classes_bv_tests-test7_1+bv_type-vecs # Error code: 14
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9e2b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16788] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16788] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1_mat+bv_type-vecs # Error code: 14
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16815] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb18ad000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16820] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16820] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1+bv_type-contiguous # Error code: 14
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16832] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa897a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16835] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16835] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1_mat+bv_type-contiguous # Error code: 14
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16849] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f013000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16854] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16854] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1+bv_type-svec # Error code: 14
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16866] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8a2ce000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16869] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16869] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1_mat+bv_type-svec # Error code: 14
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb7b14000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16888] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16888] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_1+bv_type-mat # Error code: 14
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16900] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f94a80000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16903] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16903] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_2.counts
not ok sys_classes_bv_tests-test7_1_mat+bv_type-mat # Error code: 14
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f9fb3c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16939] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16939] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_1_mat # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_2_mat.counts
not ok sys_classes_bv_tests-test7_2+bv_type-vecs # Error code: 14
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f86b33000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:16951] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:16947] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:16950] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:16951] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-16947@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:16947] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171
# ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_2+bv_type-contiguous # Error code: 14
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb1727000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17002] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17005] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:17005] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17002@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2+bv_type-svec # Error code: 14 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # 
particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f824af000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17026] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17022] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17026] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:17025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17022@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2_mat+bv_type-vecs # Error code: 14 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # 
particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa8654000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:16982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:16985] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:16986] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:16986] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:16985] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-16982@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_2_mat+bv_type-contiguous # Error code: 14 not ok sys_classes_bv_tests-test7_2+bv_type-mat # Error code: 14 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The 
gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbb94b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17054@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb6c7a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17059] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17042] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17057] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17059] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17057] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17042@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test7_3.counts not ok sys_classes_bv_tests-test7_2_mat+bv_type-svec # Error code: 14 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in 
file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17090] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9d6c7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17110] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17110] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17090@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-vecs_bv_matmult-vecs # Error code: 14 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17108] PMIX ERROR: 
PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f82a08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17115] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17114] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17115] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17114] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17108@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-vecs_bv_matmult-mat # Error code: 14 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17149] PMIX ERROR: 
PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9d5d6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17154] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17155] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17155] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17154] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17149@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-contiguous_bv_matmult-vecs # Error code: 14 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17175] PMIX ERROR: 
PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf86e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17175] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17179] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17178] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17179] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17178] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17175@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_2_mat+bv_type-mat # Error code: 14
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17135] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8dc84000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17150] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17151] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17150] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17151] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17135@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test7_2_mat # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_1.counts
not ok sys_classes_bv_tests-test8_1+bv_type-vecs # Error code: 14
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17224] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa0ab8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17227] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17227] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_1+bv_type-contiguous # Error code: 14
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17245] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8628e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17248] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17248] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_1+bv_type-svec # Error code: 14
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f93e11000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17265] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17265] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_3+bv_type-contiguous_bv_matmult-mat # Error code: 14
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17197] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3faf3a1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17222] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17222] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17197@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_1+bv_type-mat # Error code: 14
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17279] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f806e6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17282] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17282] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_2.counts
not ok sys_classes_bv_tests-test8_2+bv_type-vecs # Error code: 14
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17329] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9c084000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17332] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17332] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_3+bv_type-svec_bv_matmult-vecs # Error code: 14
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa65c6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17294] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17297] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17298] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17297] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** on a NULL communicator
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17298] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17294@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_2+bv_type-contiguous # Error code: 14
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17346] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f814c1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17349] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17349] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_2+bv_type-svec # Error code: 14
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17379] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb1705000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test7_3+bv_type-svec_bv_matmult-mat # Error code: 14
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3faafad000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17361] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17367] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17361@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_2+bv_type-mat # Error code: 14
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17400] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb809f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17403] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17403] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_bv_tests-test8_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test8_3.counts
not ok sys_classes_bv_tests-test8_3+bv_type-vecs # Error code: 14
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17446] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9824c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17449] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17449] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_3+bv_type-contiguous # Error code: 14
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17467] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fab900000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17470] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17470] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test7_3+bv_type-mat_bv_matmult-vecs # Error code: 14 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17417] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fad810000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17444] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17444] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-17417@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test8_3+bv_type-svec # Error code: 14 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17484] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f86d39000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17499] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17499] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test8_3+bv_type-mat # Error code: 14
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17521] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb4478000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17524] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17524] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test8_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_1.counts
not ok sys_classes_bv_tests-test7_3+bv_type-mat_bv_matmult-mat # Error code: 14
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9a8d8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17502] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17503] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17503] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17502] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17496@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test7_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_1_svec_vecs.counts
not ok sys_classes_bv_tests-test9_1+bv_type-vecs # Error code: 14
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17551] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f99da4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17569] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17569] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test9_1_svec_vecs # Error code: 14
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17579] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa7a99000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17582] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17582] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_bv_tests-test9_1_svec_vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_2.counts
not ok sys_classes_bv_tests-test9_1+bv_type-contiguous # Error code: 14
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17596] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb5baf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17601] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17601] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_1+bv_type-svec # Error code: 14 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17644] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fba4b7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17647] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17647] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_2+bv_type-vecs # Error code: 14 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9ae7e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17626] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:17631] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:17630] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17630] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17631] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-17626@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff not ok sys_classes_bv_tests-test9_1+bv_type-mat # Error code: 14 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17665] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8f6ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:17668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:17668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_bv_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_bv_tests-test9_2_svec_vecs.counts not ok sys_classes_bv_tests-test9_2+bv_type-contiguous # Error code: 14 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb378a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17691] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17691] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17682@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff
not ok sys_classes_bv_tests-test9_2_svec_vecs # Error code: 14
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9af67000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17711] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17714] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17715] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17715] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17714] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17711@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test9_2_svec_vecs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test1_1.counts
not ok sys_classes_bv_tests-test9_2+bv_type-svec # Error code: 14
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa68b2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17731] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17738] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17739] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17739] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:17738] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17731@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test1_1 # Error code: 14
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17768] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f87273000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17771] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17771] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test1_2.counts
not ok sys_classes_bv_tests-test9_2+bv_type-mat # Error code: 14
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f98a66000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:17788] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:17789] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17788] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17789] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-17783@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_bv_tests-test9_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test12_1.counts
not ok sys_classes_ds_tests-test1_2 # Error code: 14
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17814] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb032c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17819] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17819] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test1_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test13_1.counts
not ok sys_classes_ds_tests-test12_1 # Error code: 14
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17846] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f85551000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test13_2.counts
not ok sys_classes_ds_tests-test13_1 # Error code: 14
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17876] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fab37b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17879] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17879] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test13_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test15_1.counts
not ok sys_classes_ds_tests-test13_2 # Error code: 14
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17906] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f86c70000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17911] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17911] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test13_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test16_1.counts
not ok sys_classes_ds_tests-test15_1 # Error code: 14
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17936] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f92689000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17940] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17940] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test15_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test17_1.counts
not ok sys_classes_ds_tests-test16_1 # Error code: 14
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17966] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f85245000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:17971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:17971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test16_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test18_1.counts
not ok sys_classes_ds_tests-test17_1 # Error code: 14
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:17996] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f814ba000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18001] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18001] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test17_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test18_2.counts
not ok sys_classes_ds_tests-test18_1+nsize-1 # Error code: 14
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18026] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa8792000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test18_2+nsize-1 # Error code: 14
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbc158000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test18_2+nsize-2 # Error code: 14
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbae6a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18091] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18098] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18099] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18098] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18099] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18091@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test18_1+nsize-2 # Error code: 14
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbbf1b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18076] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18076] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and no--------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# t able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18073@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test18_2+nsize-3 # Error code: 14
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9b62a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18115] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18118] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18120] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18119] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18118] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18120] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18119] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18115@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_ds_tests-test18_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test19_1.counts
not ok sys_classes_ds_tests-test19_1 # Error code: 14
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18168] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa185c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18177] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18177] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test19_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test2_1.counts
not ok sys_classes_ds_tests-test18_1+nsize-3 # Error code: 14
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f94f6b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18141] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18143] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18132] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18142] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18141] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18143] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18132@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_ds_tests-test18_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test2_2.counts
not ok sys_classes_ds_tests-test2_1+ds_method-0 # Error code: 14
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18204] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f978e9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18207] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18207] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test2_2+ds_method-0 # Error code: 14
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18234] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f94bea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18240] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18240] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test2_1+ds_method-1 # Error code: 14
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18249] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb1cba000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18252] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18252] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test2_2+ds_method-1 # Error code: 14
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18267] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa15f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18273] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18273] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test2_1+ds_method-2 # Error code: 14
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18283] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb01ff000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18286] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18286] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test20_1.counts
not ok sys_classes_ds_tests-test2_2+ds_method-2 # Error code: 14
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18300] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb6f62000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18305] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18305] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test2_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test21_1.counts
not ok sys_classes_ds_tests-test20_1 # Error code: 14
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18330] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fac772000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18334] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18334] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test20_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test21_2.counts
not ok sys_classes_ds_tests-test21_1+nsize-1 # Error code: 14
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18360] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb0a5b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test21_2+nsize-1 # Error code: 14
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18390] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9868f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18394] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18394] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test21_1+nsize-2 # Error code: 14
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fade7b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18410] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18411] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18411] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18410] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18407@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test21_2+nsize-2 # Error code: 14
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3faca72000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18425] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18433] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18425@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test21_1+nsize-3 # Error code: 14
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8dbb8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:18449] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test21_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_1.counts
not ok sys_classes_ds_tests-test21_2+nsize-3 # Error code: 14
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f88b31000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18473] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18475] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18466] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:18474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18475] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-18466@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_classes_ds_tests-test21_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_2.counts
not ok sys_classes_ds_tests-test22_1 # Error code: 14
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18503] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb9f46000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18511] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18511] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test22_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test22_2_extrarow.counts
not ok sys_classes_ds_tests-test22_2 # Error code: 14
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18536] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9de4a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18539] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18539] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test22_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test23_1.counts
not ok sys_classes_ds_tests-test22_2_extrarow # Error code: 14
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18567] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9e58e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18571] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18571] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test22_2_extrarow # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test24_1.counts
not ok sys_classes_ds_tests-test23_1 # Error code: 14
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18596] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8fd9a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test23_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test24_1_extrarow.counts
not ok sys_classes_ds_tests-test24_1 # Error code: 14
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18627] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa114f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18633] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18633] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test24_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test25_1.counts
not ok sys_classes_ds_tests-test24_1_extrarow # Error code: 14
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18656] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9c103000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18659] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18659] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test24_1_extrarow # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_1.counts
not ok sys_classes_ds_tests-test25_1 # Error code: 14
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18687] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa53c1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18692] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18692] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test25_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_2.counts
not ok sys_classes_ds_tests-test26_1 # Error code: 14
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18716] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb3513000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18719] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18719] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test26_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test26_3.counts
not ok sys_classes_ds_tests-test26_2 # Error code: 14
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18747] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f952ed000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18752] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18752] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test26_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test27_1.counts
not ok sys_classes_ds_tests-test26_3+reorthog-0 # Error code: 14
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18776] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbcb7b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18779] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18779] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test26_3 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test27_1 # Error code: 14
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18806] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb1eec000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18811] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18811] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test27_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test3_1.counts
not ok sys_classes_ds_tests-test26_3+reorthog-1 # Error code: 14
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18823] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9fab6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18826] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18826] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test26_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test3_2.counts
not ok sys_classes_ds_tests-test3_1+ds_method-0 # Error code: 14
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18855] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb66f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18871] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18871] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test3_2+ds_method-0 # Error code: 14
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9608b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18886] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18886] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test3_1+ds_method-1 # Error code: 14
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18900] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa2e32000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18905] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18905] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test3_2+ds_method-1 # Error code: 14
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f95d8b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18920] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18920] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test3_1+ds_method-2 # Error code: 14
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fad5ab000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18940] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18940] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test4_1.counts
not ok sys_classes_ds_tests-test3_2+ds_method-2 # Error code: 14
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:18951] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb563b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:18954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:18954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test3_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test5_1.counts
not ok sys_classes_ds_tests-test4_1 # Error code: 14
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19001] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f92303000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19011] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19011] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test4_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test5_2.counts
not ok sys_classes_ds_tests-test5_1+ds_method-0 # Error code: 14
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f917b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19014] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19014] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test5_1+ds_method-1 # Error code: 14
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19052] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f81794000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19060] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19060] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
not ok sys_classes_ds_tests-test5_2+ds_method-0 # Error code: 14
ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19055] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f82a4a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test5_2+ds_method-1 # Error code: 14
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19089] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbc5ca000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19092] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19092] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
not ok sys_classes_ds_tests-test5_1+ds_method-2 # Error code: 14
ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19086] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb8070000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19095] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19095] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test5_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test6_1.counts
not ok sys_classes_ds_tests-test5_2+ds_method-2 # Error code: 14
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19119] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbd60d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19139] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19139] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test5_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test6_2.counts
not ok sys_classes_ds_tests-test6_1+ds_method-0 # Error code: 14
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19137] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fadf59000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19142] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19142] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test6_1+ds_method-1 # Error code: 14
not ok sys_classes_ds_tests-test6_2+ds_method-0 # Error code: 14
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19179] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f85c34000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19186] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19186] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fae8dd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19189] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19189] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test6_2+ds_method-1 # Error code: 14
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19217] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa2dd4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test6_1+ds_method-2 # Error code: 14
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19215] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f84a08000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19223] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19223] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test6_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test7_1.counts
not ok sys_classes_ds_tests-test6_2+ds_method-2 # Error code: 14
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19243] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f95a63000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test6_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test7_2.counts
not ok sys_classes_ds_tests-test7_1+ds_method-0 # Error code: 14
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19266] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa214b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19270] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19270] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test7_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test7_2+ds_method-0 # Error code: 14
not ok sys_classes_ds_tests-test7_1+ds_method-1 # Error code: 14
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19305] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fab404000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19315] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19315] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19311] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f869dd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test7_1 # SKIP Command failed so no diff
ok sys_classes_ds_tests-test7_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_1.counts
not ok sys_classes_ds_tests-test7_2+ds_method-1 # Error code: 14
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19344] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fba6df000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19359] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19359] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test7_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_2.counts
not ok sys_classes_ds_tests-test8_1+ds_method-0 # Error code: 14
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19361] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbba94000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test8_1 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test8_1+ds_method-1 # Error code: 14
not ok sys_classes_ds_tests-test8_2+ds_method-0 # Error code: 14
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19405] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbc56f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19411] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19411] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19402] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbd7db000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19409] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19409] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test8_2 # SKIP Command failed so no diff
ok sys_classes_ds_tests-test8_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test8_3.counts
not ok sys_classes_ds_tests-test8_2+ds_method-1 # Error code: 14
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19438] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa1b45000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19453] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19453] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test8_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_ds_tests-test9_1.counts
not ok sys_classes_ds_tests-test8_3+ds_method-0 # Error code: 14
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19455] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa16cc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19458] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19458] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_ds_tests-test8_3 # SKIP Command failed so no diff
not ok sys_classes_ds_tests-test8_3+ds_method-1 # Error code: 14
not ok sys_classes_ds_tests-test9_1 # Error code: 14
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19499] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f900e3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19505] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19505] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19498] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f95262000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19504] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19504] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_ds_tests-test8_3 # SKIP Command failed so no diff
ok sys_classes_ds_tests-test9_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test1_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test10_1.counts
not ok sys_classes_fn_tests-test1_1 # Error code: 14
not ok sys_classes_fn_tests-test10_1 # Error code: 14
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8562c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19565] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19565] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test10_1 # SKIP Command failed so no diff
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f826ea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19564] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19564] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test11_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test11_2.counts
not ok sys_classes_fn_tests-test11_1 # Error code: 14
not ok sys_classes_fn_tests-test11_2 # Error code: 14
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19618] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbccec000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19624] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19624] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test11_1 # SKIP Command failed so no diff
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb7060000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19625] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19625] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test11_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test12_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test12_1_rational.counts
not ok sys_classes_fn_tests-test12_1+fn_type-exp # Error code: 14
not ok sys_classes_fn_tests-test12_1_rational # Error code: 14
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19678] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb213f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19684] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19684] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test12_1 # SKIP Command failed so no diff
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19679] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f831fb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19685] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19685] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test12_1_rational # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test13_1.counts
not ok sys_classes_fn_tests-test12_1+fn_type-sqrt # Error code: 14
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19712] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8de12000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19727] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19727] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test13_1_triang.counts
not ok sys_classes_fn_tests-test13_1 # Error code: 14
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19729] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f839ab000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19732] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19732] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test13_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test2_1.counts
not ok sys_classes_fn_tests-test13_1_triang # Error code: 14
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb369e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19787] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19787] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test13_1_triang # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_1.counts
not ok sys_classes_fn_tests-test2_1 # Error code: 14
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19789] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f87f0f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19792] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19792] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_1_subdiagonalpade.counts
not ok sys_classes_fn_tests-test3_1+fn_method-0 # Error code: 14
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19832] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fae238000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19847] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19847] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_1_subdiagonalpade+fn_method-2 # Error code: 14
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19849] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f0ff000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_1_subdiagonalpade # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_1+fn_method-1 # Error code: 14
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19868] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9ad81000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19882] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19882] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_2.counts
not ok sys_classes_fn_tests-test3_1_subdiagonalpade+fn_method-3 # Error code: 14
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8f5a2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19886] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19886] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_1_subdiagonalpade # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_2_subdiagonalpade.counts
not ok sys_classes_fn_tests-test3_2+fn_method-0 # Error code: 14
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa2730000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19942] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19942] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_2 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_2_subdiagonalpade+fn_method-2 # Error code: 14
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19943] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9550c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19946] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19946] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_2_subdiagonalpade # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_2+fn_method-1 # Error code: 14
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19962] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f92a0d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19976] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19976] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok sys_classes_fn_tests-test3_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_3.counts
not ok sys_classes_fn_tests-test3_2_subdiagonalpade+fn_method-3 # Error code: 14
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:19977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb6e05000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:19980] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:19980] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_2_subdiagonalpade # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_3_subdiagonalpade.counts
not ok sys_classes_fn_tests-test3_3+fn_method-0 # Error code: 14
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20026] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9680c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20037] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20037] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_3 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_3_subdiagonalpade+fn_method-2 # Error code: 14
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f887bc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20040] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20040] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_3_subdiagonalpade # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_3+fn_method-1 # Error code: 14
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa0ae0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_3 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_3_subdiagonalpade+fn_method-3 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_4.counts
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20069] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8ae41000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20074] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20074] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_3_subdiagonalpade # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_4_subdiagonalpade.counts
not ok sys_classes_fn_tests-test3_4+fn_method-0 # Error code: 14
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8fa5c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_4 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_4_subdiagonalpade+fn_method-2 # Error code: 14
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20128] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9b382000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20134] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20134] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_4_subdiagonalpade # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_4+fn_method-1 # Error code: 14
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20152] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb200d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20165] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20165] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_4 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_4_subdiagonalpade+fn_method-3 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_5.counts
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20162] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f84c6a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20168] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20168] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test3_4_subdiagonalpade # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test3_6.counts
not ok sys_classes_fn_tests-test3_5+fn_method-2 # Error code: 14
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20215] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fba8d6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20225] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20225] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_5 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_6+fn_method-2 # Error code: 14
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20222] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8aa9a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20228] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20228] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_6 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_5+fn_method-3 # Error code: 14
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20245] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb34fd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20259] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20259] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_fn_tests-test3_5 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test3_6+fn_method-3 # Error code: 14
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fae175000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20262] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20262] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test4_1.counts
ok sys_classes_fn_tests-test3_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test5_1.counts
not ok sys_classes_fn_tests-test4_1 # Error code: 14
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20310] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fbefc9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20319] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20319] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test4_1 # SKIP Command failed so no diff
not ok sys_classes_fn_tests-test5_1 # Error code: 14
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20316] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa423a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_fn_tests-test5_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test5_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test6_1.counts
not ok sys_classes_fn_tests-test5_2 # Error code: 14
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:20370] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9574a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:20379] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:20379] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test5_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test6_1 # Error code: 14 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20376] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9533b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test6_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test6_2.counts TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_1.counts not ok sys_classes_fn_tests-test6_2 # Error code: 14 not ok sys_classes_fn_tests-test7_1+fn_method-0 # Error code: 14 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20431] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 
component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f96a83000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20439] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20439] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test6_2 # SKIP Command failed so no diff # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20436] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa6449000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_1_sadeghi.counts not ok sys_classes_fn_tests-test7_1+fn_method-1 # Error code: 14 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20471] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa4ac5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20486] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20486] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7_1_sadeghi # Error code: 14 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f836f0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20489] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20489] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1_sadeghi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_2.counts not ok sys_classes_fn_tests-test7_1+fn_method-2 # Error code: 14 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa3938000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20530] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20530] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_2_sadeghi.counts not ok sys_classes_fn_tests-test7_2+fn_method-0 # Error code: 14 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20533] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f80b98000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20536] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20536] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7_2_sadeghi # Error code: 14 not ok sys_classes_fn_tests-test7_2+fn_method-1 # Error code: 14 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fae3f1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20577] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8c039000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20582] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20582] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test7_2_sadeghi # SKIP Command failed so no diff ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_3.counts not ok sys_classes_fn_tests-test7_2+fn_method-2 # Error code: 14 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83190000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20627] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20627] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test7_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test7_3_inplace.counts not ok sys_classes_fn_tests-test7_3 # Error code: 14 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb5044000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20624] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20632] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20631] PMIX ERROR: 
PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20630] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20631] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20632] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20630] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-20624@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_fn_tests-test7_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test8_1.counts not ok sys_classes_fn_tests-test8_1+fn_method-0 # Error code: 14 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20690] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb7398000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20698] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20698] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_1+fn_method-1 # Error code: 14 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbc396000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test7_3_inplace # Error code: 14 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa5193000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:20693] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:20694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:20695] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20693] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20695] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-20683@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok sys_classes_fn_tests-test7_3_inplace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test8_2.counts not ok sys_classes_fn_tests-test8_1+fn_method-2 # Error code: 14 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20735] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fba021000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20756] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20756] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_2+fn_method-0 # Error code: 14 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20763] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f85854000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20766] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20766] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test8_1+fn_method-3 # Error code: 14 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20780] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fabcc4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20796] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20796] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_fn_tests-test9_1.counts not ok sys_classes_fn_tests-test8_2+fn_method-1 # Error code: 14 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9ed0e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_fn_tests-test9_1 # Error code: 14 not ok sys_classes_fn_tests-test8_2+fn_method-2 # Error code: 14 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20840] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fafb28000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_fn_tests-test9_1 # SKIP Command failed so no diff # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20841] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb0f3a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20847] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20847] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1_1.counts not ok sys_classes_fn_tests-test8_2+fn_method-3 # Error code: 14 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20875] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8d822000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20891] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20891] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_fn_tests-test8_2 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test1_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test1_2.counts # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20888] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb4c7c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20894] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20894] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_1.counts not ok sys_classes_rg_tests-test1_2 # Error code: 14 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20941] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa3a78000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20951] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20951] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test1_2 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test2_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_2.counts # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:20948] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa47c5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:20954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:20954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok sys_classes_rg_tests-test2_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_3.counts not ok sys_classes_rg_tests-test2_2 # Error code: 14 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21001] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb7736000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21011] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21011] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok sys_classes_rg_tests-test2_2 # SKIP Command failed so no diff not ok sys_classes_rg_tests-test2_3 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_4.counts # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8b6ee000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21014] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21014] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_rg_tests-test2_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test2_5.counts
not ok sys_classes_rg_tests-test2_4 # Error code: 14
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21061] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one.
Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb7d64000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test2_4 # SKIP Command failed so no diff
not ok sys_classes_rg_tests-test2_5 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1.counts
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21068] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9acce000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21074] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21074] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test2_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_ellipse.counts
not ok sys_classes_rg_tests-test3_1 # Error code: 14
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f98f33000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_rg_tests-test3_1_ellipse # Error code: 14
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21128] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8377e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21134] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21134] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_rg_tests-test3_1_ellipse # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_interval.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_1_ring.counts
not ok sys_classes_rg_tests-test3_1_interval # Error code: 14
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21182] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9ec57000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21191] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21191] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_1_interval # SKIP Command failed so no diff
not ok sys_classes_rg_tests-test3_1_ring # Error code: 14
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21188] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8aa67000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21194] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21194] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_1_ring # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_2_interval.counts
not ok sys_classes_rg_tests-test3_2 # Error code: 14
not ok sys_classes_rg_tests-test3_2_interval # Error code: 14
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb5c44000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21251] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21251] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f859cf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21254] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21254] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_2_interval # SKIP Command failed so no diff
ok sys_classes_rg_tests-test3_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_ellipse.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_interval.counts
not ok sys_classes_rg_tests-test3_4_ellipse # Error code: 14
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21307] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3faf33a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21313] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21313] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_4_ellipse # SKIP Command failed so no diff
not ok sys_classes_rg_tests-test3_4_interval # Error code: 14
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21308] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb1988000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21314] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21314] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_rg_tests-test3_4_interval # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_rg_tests-test3_4_ring.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test1_1.counts
not ok sys_classes_rg_tests-test3_4_ring # Error code: 14
not ok sys_classes_st_tests-test1_1+st_matmode-inplace # Error code: 14
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21365] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f86824000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21368] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f96e51000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21374] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21374] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test1_1 # SKIP Command failed so no diff
ok sys_classes_rg_tests-test3_4_ring # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test2_1.counts
not ok sys_classes_st_tests-test1_1+st_matmode-shell # Error code: 14
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa7fdd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21416] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21416] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok sys_classes_st_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test3_1.counts not ok sys_classes_st_tests-test2_1+st_matmode-copy # Error code: 14 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:21418] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa0c49000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:21421] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:21421] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test3_1+st_matmode-copy # Error code: 14
not ok sys_classes_st_tests-test2_1+st_matmode-inplace # Error code: 14
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21462] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f93996000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21467] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21467] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21460] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa8620000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21468] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21468] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff
ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test3_1+st_matmode-inplace # Error code: 14
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21496] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f83ac0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21499] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21499] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test2_1+st_matmode-shell # Error code: 14
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21495] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbb13a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21502] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21502] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test4_1.counts
not ok sys_classes_st_tests-test3_1+st_matmode-shell # Error code: 14
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21524] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb555d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21546] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21546] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test4_2.counts
not ok sys_classes_st_tests-test4_1+st_matmode-copy # Error code: 14
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21544] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f82efe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21549] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21549] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test4_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test4_1+st_matmode-shell # Error code: 14
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21586] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa7b14000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21593] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21593] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test4_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test4_2+st_matmode-copy # Error code: 14
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21590] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f95011000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21596] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21596] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test4_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test5_1.counts
not ok sys_classes_st_tests-test4_2+st_matmode-shell # Error code: 14
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f91d62000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21640] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21640] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test4_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test5_1_shell.counts
not ok sys_classes_st_tests-test5_1+st_matmode-copy # Error code: 14
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fb51a6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21643] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21643] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test5_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test5_1+st_matmode-inplace # Error code: 14
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21679] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa721a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21687] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21687] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test5_1 # SKIP Command failed so no diff
not ok sys_classes_st_tests-test5_1_shell # Error code: 14
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21684] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3face93000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21690] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21690] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test5_1_shell # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-copy.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-inplace.counts
not ok sys_classes_st_tests-test6_1_st_matmode-copy # Error code: 14
not ok sys_classes_st_tests-test6_1_st_matmode-inplace # Error code: 14
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21742] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9d982000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21749] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21749] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21744] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb295a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21750] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21750] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test6_1_st_matmode-copy # SKIP Command failed so no diff
ok sys_classes_st_tests-test6_1_st_matmode-inplace # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test6_1_st_matmode-shell.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-cayley.counts
not ok sys_classes_st_tests-test7_1_st_type-cayley # Error code: 14
not ok sys_classes_st_tests-test6_1_st_matmode-shell # Error code: 14
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21803] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9dd5c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21809] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21809] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21804] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f434000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21810] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21810] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test6_1_st_matmode-shell # SKIP Command failed so no diff
ok sys_classes_st_tests-test7_1_st_type-cayley # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-shift.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test7_1_st_type-sinvert.counts
not ok sys_classes_st_tests-test7_1_st_type-shift # Error code: 14
not ok sys_classes_st_tests-test7_1_st_type-sinvert # Error code: 14
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21863] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9b69c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21869] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21869] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test7_1_st_type-shift # SKIP Command failed so no diff
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21864] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9a26b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21870] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21870] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_classes_st_tests-test7_1_st_type-sinvert # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-cayley.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-shift.counts
not ok sys_classes_st_tests-test8_1_st_type-cayley # Error code: 14
not ok sys_classes_st_tests-test8_1_st_type-shift # Error code: 14
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21923] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb34d8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21930] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21930] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21924] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9b527000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21929] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21929] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test8_1_st_type-cayley # SKIP Command failed so no diff
ok sys_classes_st_tests-test8_1_st_type-shift # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test8_1_st_type-sinvert.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test9_1_st_type-shift.counts
not ok sys_classes_st_tests-test9_1_st_type-shift # Error code: 14
not ok sys_classes_st_tests-test8_1_st_type-sinvert # Error code: 14
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21983] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa2a97000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21990] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21990] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:21984] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3faa1f6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:21989] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:21989] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test9_1_st_type-shift # SKIP Command failed so no diff
ok sys_classes_st_tests-test8_1_st_type-sinvert # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_classes_st_tests-test9_1_st_type-sinvert.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_mat_tests-test1_1.counts
not ok sys_classes_st_tests-test9_1_st_type-sinvert # Error code: 14
not ok sys_mat_tests-test1_1 # Error code: 14
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22044] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9fef7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22050] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22050] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_classes_st_tests-test9_1_st_type-sinvert # SKIP Command failed so no diff
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22043] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb9514000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22049] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22049] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_mat_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_mat_tests-test1_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test1_1.counts
not ok sys_tests-test1_1 # Error code: 14
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22103] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9ef6d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22109] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22109] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test3_arpack.counts
not ok sys_tests-test3_arpack # Error code: 14
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22142] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f89380000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22145] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22145] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_tests-test3_arpack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test3_no-primme.counts
not ok sys_mat_tests-test1_2 # Error code: 14
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22104] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8ac03000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22110] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:22111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22110] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-22104@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok sys_mat_tests-test1_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test4_1.counts
not ok sys_tests-test3_no-primme # Error code: 14
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22187] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3faa5b7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22200] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22200] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_tests-test3_no-primme # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_tests-test4_2.counts
not ok sys_tests-test4_1 # Error code: 14
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22199] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb816c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22203] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22203] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_tests-test4_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_tutorials-ex33_1.counts
not ok sys_tests-test4_2 # Error code: 14
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22250] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9357b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22260] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22260] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok sys_tests-test4_2 # SKIP Command failed so no diff
not ok sys_tutorials-ex33_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/sys_vec_tests-test1_1.counts
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22257] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f97941000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22263] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22263] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_tutorials-ex33_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/sys_vec_tests-test1_2.counts
not ok sys_vec_tests-test1_1 # Error code: 14
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22310] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f83b0c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22320] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22320] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok sys_vec_tests-test1_1 # SKIP Command failed so no diff
not ok sys_vec_tests-test1_2 # Error code: 14
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22317] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa31b5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:22324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-22317@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
RM test-rm-sys.F90
ok sys_vec_tests-test1_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_subspace.counts
not ok eps_tests-test1_1+eps_type-krylovschur # Error code: 14
not ok eps_tests-test1_1_subspace # Error code: 14
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22377] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9b3d6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22384] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22384] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1 # SKIP Command failed so no diff
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22381] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa4e0b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22387] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22387] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_subspace # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_nopurify.counts
not ok eps_tests-test1_1+eps_type-arnoldi # Error code: 14
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22413] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f99877000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22428] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1 # SKIP Command failed so no diff
not ok eps_tests-test1_1_ks_nopurify # Error code: 14
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22431] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f96916000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22434] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22434] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_ks_nopurify # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_trueres.counts
not ok eps_tests-test1_1+eps_type-gd # Error code: 14
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22450] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f82f2e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1 # SKIP Command failed so no diff
not ok eps_tests-test1_1_ks_trueres # Error code: 14
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22478] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa53ca000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22481] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22481] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_ks_trueres # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_sinvert.counts
not ok eps_tests-test1_1+eps_type-jd # Error code: 14
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22497] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9a127000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22517] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22517] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1 # SKIP Command failed so no diff
not ok eps_tests-test1_1_ks_sinvert # Error code: 14
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22525] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9325b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22528] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22528] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_ks_sinvert # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ks_cayley.counts
not ok eps_tests-test1_1+eps_type-lapack # Error code: 14
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22542] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa5df7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22549] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22549] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_lanczos.counts
not ok eps_tests-test1_1_ks_cayley # Error code: 14
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22572] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fadeb6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22575] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22575] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_ks_cayley # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_gd2.counts
not ok eps_tests-test1_1_lanczos # Error code: 14
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22604] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb6887000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22611] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22611] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test1_1_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_gd_borth.counts
not ok eps_tests-test1_1_gd2 # Error code: 14
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22632] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb71d2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22635] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22635] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_jd_borth.counts
not ok eps_tests-test1_1_gd_borth # Error code: 14
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f96e7f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22671] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22671] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_gd_borth # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_lobpcg.counts
not ok eps_tests-test1_1_jd_borth # Error code: 14
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22692] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3facbde000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22695] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22695] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_jd_borth # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_cholesky.counts
not ok eps_tests-test1_1_lobpcg # Error code: 14
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22724] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9d19e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22730] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22730] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_scalapack.counts
not ok eps_tests-test1_1_cholesky # Error code: 14
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22752] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9c356000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22755] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22755] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test1_1_cholesky # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss.counts
not ok eps_tests-test1_1_scalapack+nsize-1 # Error code: 14
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f84e91000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22789] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22789] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff
not ok eps_tests-test1_1_ciss+eps_ciss_extraction-ritz # Error code: 14
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22812] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f925f1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22815] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22815] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test1_1_ciss # SKIP Command failed so no diff
not ok eps_tests-test1_1_scalapack+nsize-2 # Error code: 14
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fad2ec000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22833] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:22832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:22833] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-22829@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff
not ok eps_tests-test1_1_ciss+eps_ciss_extraction-hankel # Error code: 14
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22847] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbbe79000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test1_1_ciss # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_ksps.counts
not ok eps_tests-test1_1_ciss_ksps # Error code: 14
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:22904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f998a7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:22907] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:22907] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss_ksps # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_gnhep.counts not ok eps_tests-test1_1_ciss_gnhep # Error code: 14 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f91579000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22937] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22937] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_1_ciss_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_1_ciss_trapezoidal.counts not ok eps_tests-test1_1_scalapack+nsize-3 # Error code: 14 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fae077000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22872] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22866] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:22870] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:22869] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22869] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22872] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22870] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated.
The first process to do so was: # # Process name: [prterun-sbuild-22866@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2.counts not ok eps_tests-test1_1_ciss_trapezoidal # Error code: 14 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22964] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f89d5d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22967] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22967] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_1_ciss_trapezoidal # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2_open.counts not ok eps_tests-test1_2 # Error code: 14 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:22993] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa19b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:22999] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:22999] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_2_parallel.counts not ok eps_tests-test1_2_open # Error code: 14 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23022] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb2dc4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_2_open # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_3.counts not ok eps_tests-test1_2_parallel # Error code: 14 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb5caf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23063] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:23062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23063] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-23054@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_2_parallel # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_4.counts not ok eps_tests-test1_3 # Error code: 14 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23084] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb2f8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23093] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23093] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_5_rqcg.counts not ok eps_tests-test1_4+eps_power_shift_type-constant # Error code: 14 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f929f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test1_4 # SKIP Command failed so no diff not ok eps_tests-test1_5_rqcg # Error code: 14 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23148] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f85572000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23152] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23152] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_5_rqcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_5_lobpcg.counts not ok eps_tests-test1_4+eps_power_shift_type-rayleigh # Error code: 14 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23165] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f96fa9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23168] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23168] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_4 # SKIP Command failed so no diff not ok eps_tests-test1_5_lobpcg # Error code: 14 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23196] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb3516000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23202] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23202] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_5_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6.counts not ok eps_tests-test1_4+eps_power_shift_type-wilkinson # Error code: 14 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23212] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8d676000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23215] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23215] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6_lanczos.counts not ok eps_tests-test1_6+eps_type-krylovschur # Error code: 14 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23248] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa1839000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23265] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23265] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_6 # SKIP Command failed so no diff not ok eps_tests-test1_6_lanczos # Error code: 14 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23272] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb09d7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23275] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23275] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test1_6_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_6_subspace.counts not ok eps_tests-test1_6+eps_type-arnoldi # Error code: 14 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23290] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb16b2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23295] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23295] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test1_6 # SKIP Command failed so no diff not ok eps_tests-test1_6_subspace # Error code: 14 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23319] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa6592000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_6_subspace # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_ghep.counts not ok eps_tests-test1_6+eps_type-gd # Error code: 14 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23336] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa0ec1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23340] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23340] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test1_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_gnhep.counts not ok eps_tests-test1_9_ks_gnhep # Error code: 14 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fba13a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23397] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23404] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23405] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23404] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:23405] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23397@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test1_9_ks_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_ks_ghiep.counts not ok eps_tests-test1_9_ks_ghep # Error code: 14 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 
component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf545000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23371] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23371] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23366@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_9_ks_ghep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_lobpcg_ghep.counts not ok eps_tests-test1_9_lobpcg_ghep # Error code: 14 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb66ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23467] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23471] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23470] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23470] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error 
messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23471] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-23467@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test1_9_lobpcg_ghep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test1_9_jd_gnhep.counts not ok eps_tests-test1_9_ks_ghiep # Error code: 14 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa65a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23438] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23437] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23438] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23437] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-23434@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test1_9_ks_ghiep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1.counts not ok eps_tests-test10_1+eps_type-krylovschur # Error code: 14 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a 
different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb7752000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23536] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23536] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test10_1 # SKIP Command failed so no diff not ok eps_tests-test1_9_jd_gnhep # Error code: 14 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f921f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:23500] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:23503] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:23504] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:23504] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23503] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-23500@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok eps_tests-test1_9_jd_gnhep # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_lobpcg.counts
not ok eps_tests-test10_1+eps_type-arnoldi # Error code: 14
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23550] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa6606000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23553] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23553] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test10_1 # SKIP Command failed so no diff
not ok eps_tests-test10_1_lobpcg # Error code: 14
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23580] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbda0e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23589] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23589] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test10_1_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_lanczos.counts
not ok eps_tests-test10_1+eps_type-gd # Error code: 14
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23595] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f89145000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23598] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23598] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test10_1 # SKIP Command failed so no diff
not ok eps_tests-test10_1_lanczos # Error code: 14
not ok eps_tests-test10_1+eps_type-jd # Error code: 14
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23637] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f972a0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23644] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23644] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23639] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb607e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23645] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23645] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test10_1 # SKIP Command failed so no diff
ok eps_tests-test10_1_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test10_1_gd2.counts
not ok eps_tests-test10_1+eps_type-rqcg # Error code: 14
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23672] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f98298000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23689] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23689] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test10_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_1.counts
not ok eps_tests-test10_1_gd2 # Error code: 14
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23686] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8c87b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23692] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23692] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test10_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_1_ks_cayley.counts
not ok eps_tests-test11_1+eps_type-krylovschur # Error code: 14
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23739] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8b378000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23749] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23749] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test11_1 # SKIP Command failed so no diff
not ok eps_tests-test11_1_ks_cayley # Error code: 14
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23746] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f99016000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23752] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23752] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test11_1_ks_cayley # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test11_2.counts
not ok eps_tests-test11_1+eps_type-arnoldi # Error code: 14
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23768] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa1c58000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23789] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23789] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test11_1 # SKIP Command failed so no diff
not ok eps_tests-test11_2 # Error code: 14
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23796] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8ab3f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23799] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23799] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test11_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1.counts
not ok eps_tests-test11_1+eps_type-lapack # Error code: 14
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23814] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa64ae000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23840] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23840] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test11_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1_gd.counts
not ok eps_tests-test12_1+eps_type-krylovschur # Error code: 14
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23843] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbd443000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test12_1 # SKIP Command failed so no diff
not ok eps_tests-test12_1+eps_type-subspace # Error code: 14
not ok eps_tests-test12_1_gd+eps_type-gd # Error code: 14
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23880] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9e688000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23890] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23890] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23887] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb21e0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23893] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23893] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test12_1 # SKIP Command failed so no diff
ok eps_tests-test12_1_gd # SKIP Command failed so no diff
not ok eps_tests-test12_1_gd+eps_type-jd # Error code: 14
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23921] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbb94b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test12_1_gd # SKIP Command failed so no diff
not ok eps_tests-test12_1+eps_type-arnoldi # Error code: 14
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23920] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8faaa000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23927] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test12_1_gd2.counts
not ok eps_tests-test12_1+eps_type-power # Error code: 14
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23960] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbd698000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test12_1 # SKIP Command failed so no diff
not ok eps_tests-test12_1_gd2 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1.counts
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:23968] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb6854000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:23974] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:23974] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test12_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1_jd.counts
not ok eps_tests-test13_1+eps_type-krylovschur # Error code: 14
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24021] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9bfa3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test13_1 # SKIP Command failed so no diff
not ok eps_tests-test13_1_jd # Error code: 14
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24028] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8fbbe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24034] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24034] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test13_1_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_1_gd2.counts
not ok eps_tests-test13_1+eps_type-gd # Error code: 14
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24050] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9270f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test13_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_2.counts
not ok eps_tests-test13_1_gd2 # Error code: 14
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24078] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f96804000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24081] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24081] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test13_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test13_2_gd2.counts
not ok eps_tests-test13_2+eps_type-krylovschur # Error code: 14
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f95d7d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24133] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24133] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test13_2 # SKIP Command failed so no diff
not ok eps_tests-test13_2_gd2 # Error code: 14
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24138] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa77d1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24141] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24141] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test13_2_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test14_1.counts
not ok eps_tests-test13_2+eps_type-gd # Error code: 14
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24156] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fad383000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24162] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24162] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test13_2 # SKIP Command failed so no diff
not ok eps_tests-test14_1 # Error code: 14
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24185] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb132e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24188] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24188] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test14_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test16_1.counts
not ok eps_tests-test13_2+eps_type-jd # Error code: 14
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24202] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3faf012000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24206] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24206] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test13_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17_1.counts
not ok eps_tests-test16_1 # Error code: 14
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24232] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f99a34000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24237] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24237] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test16_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test17_2.counts
not ok eps_tests-test17_1 # Error code: 14
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa2625000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:24266] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:24267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24266] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-24262@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tests-test17_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_1_ks.counts
not ok eps_tests-test17_2 # Error code: 14
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24293] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8988f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24300] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24300] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test17_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_1_ks_gnhep.counts
not ok eps_tests-test18_1_ks # Error code: 14
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24325] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fae569000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24329] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24329] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test18_1_ks # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_2_gd.counts
not ok eps_tests-test18_1_ks_gnhep # Error code: 14
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24355] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8ad40000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24360] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24360] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test18_1_ks_gnhep # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test18_2_jd.counts
not ok eps_tests-test18_2_gd # Error code: 14
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24385] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f981e5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24389] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24389] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test18_2_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test19_1.counts
not ok eps_tests-test18_2_jd # Error code: 14
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24416] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb8f07000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test18_2_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1.counts
not ok eps_tests-test19_1 # Error code: 14
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24445] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f853f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24448] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24448] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test19_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_gd2.counts
not ok eps_tests-test2_1+eps_type-arnoldi # Error code: 14
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24476] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f7fc45000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24480] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24480] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test2_1 # SKIP Command failed so no diff
not ok eps_tests-test2_1_gd2 # Error code: 14
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fad163000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test2_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_krylovschur.counts
not ok eps_tests-test2_1+eps_type-gd # Error code: 14
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24522] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa37b4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24525] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24525] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test2_1 # SKIP Command failed so no diff
not ok eps_tests-test2_1_krylovschur+eps_krylovschur_locking-0 # Error code: 14
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24552] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9a17c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24557] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24557] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test2_1_krylovschur # SKIP Command failed so no diff
not ok eps_tests-test2_1+eps_type-jd # Error code: 14
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbd9f2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24572] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test2_1 # SKIP Command failed so no diff
not ok eps_tests-test2_1_krylovschur+eps_krylovschur_locking-1 # Error code: 14
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24586] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f841ec000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:24591] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:24591] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok eps_tests-test2_1_krylovschur # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_1_scalapack.counts
not ok eps_tests-test2_1_scalapack
# Error code: 14
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24635] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa022b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24638] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24638] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test2_1_scalapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_2.counts
not ok eps_tests-test2_1+eps_type-lapack
# Error code: 14
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24603] PMIX ERROR: PMIX ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24603] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f8f9b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24606] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24606] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test2_1 # SKIP Command failed so no diff
not ok eps_tests-test2_2+eps_lanczos_reorthog-local
# Error code: 14
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24677] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9515a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24680] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24680] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test2_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_2_selective.counts
not ok eps_tests-test2_2_selective
# Error code: 14
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24710] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb129f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24715] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24715] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test2_2_selective # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3.counts
not ok eps_tests-test2_2+eps_lanczos_reorthog-full
# Error code: 14
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24694] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f82b9c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test2_2 # SKIP Command failed so no diff
not ok eps_tests-test2_2+eps_lanczos_reorthog-periodic
# Error code: 14
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24749] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb7e79000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24757] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24757] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test2_2 # SKIP Command failed so no diff
not ok eps_tests-test2_3+eps_type-krylovschur
# Error code: 14
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f87db6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24760] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24754] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:24761] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24760] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24761] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-24754@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok eps_tests-test2_3 # SKIP Command failed so no diff
not ok eps_tests-test2_3+eps_type-lapack
# Error code: 14
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24791] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa71f4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24794] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:24795] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24794] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24795] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-24791@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok eps_tests-test2_3 # SKIP Command failed so no diff
not ok eps_tests-test2_2+eps_lanczos_reorthog-partial
# Error code: 14
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8ea09000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test2_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3_gd.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test2_3_jd.counts
not ok eps_tests-test2_3_jd
# Error code: 14
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24854] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f93bba000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24860] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24860] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test2_3_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test20_1.counts
not ok eps_tests-test20_1+eps_type-krylovschur
# Error code: 14
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24892] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f830ad000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24895] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24895] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test20_1+eps_type-arnoldi
# Error code: 14
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24909] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbbf5f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24912] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24912] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test2_3_gd
# Error code: 14
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa67f5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24861] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24851] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:24859] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24861] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24859] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-24851@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok eps_tests-test2_3_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test20_1_lanczos.counts
not ok eps_tests-test20_1+eps_type-gd # Error code: 14
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24926] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9888a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24929] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24929] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test20_1_lanczos # Error code: 14
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fab16b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24959] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24959] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test20_1_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test21_1.counts
not ok eps_tests-test20_1+eps_type-jd # Error code: 14
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:24971] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f994c0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:24974] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:24974] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test21_1 # Error code: 14
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fab164000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25011] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25011] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test21_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_1.counts
not ok eps_tests-test20_1+eps_type-rqcg # Error code: 14
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25018] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8670c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25021] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25021] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test22_1+eps_true_residual-0 # Error code: 14
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f88494000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25067] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25067] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
not ok eps_tests-test20_1+eps_type-lobpcg # Error code: 14
ok eps_tests-test22_1 # SKIP Command failed so no diff
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25062] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3faeaea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25068] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25068] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test20_1 # SKIP Command failed so no diff
not ok eps_tests-test22_1+eps_true_residual-1 # Error code: 14
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25093] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa7ca2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25099] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25099] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test22_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_2.counts
not ok eps_tests-test20_1+eps_type-lapack # Error code: 14
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25096] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3faeaf3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25102] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25102] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test20_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test22_3.counts
not ok eps_tests-test22_2 # Error code: 14
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f99b8d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25159] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25159] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test22_2 # SKIP Command failed so no diff
not ok eps_tests-test22_3+bv_orthog_block-gs # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_1.counts
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25156] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f822cb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25162] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25162] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test22_3 # SKIP Command failed so no diff
not ok eps_tests-test22_3+bv_orthog_block-tsqr # Error code: 14
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25197] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9c286000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25206] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25206] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test22_3 # SKIP Command failed so no diff
not ok eps_tests-test23_1 # Error code: 14
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25203] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f87502000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25209] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25209] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test23_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_2.counts
not ok eps_tests-test22_3+bv_orthog_block-chol # Error code: 14
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25229] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8c69b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25253] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25253] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test22_3 # SKIP Command failed so no diff
not ok eps_tests-test23_2 # Error code: 14
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25252] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fbd3ca000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25256] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25256] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test23_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test23_3.counts
not ok eps_tests-test22_3+bv_orthog_block-tsqrchol # Error code: 14
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25272] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8441d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25293] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25293] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test22_3 # SKIP Command failed so no diff
not ok eps_tests-test23_3 # Error code: 14
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25300] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f81257000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25303] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25303] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test23_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test24_1.counts
not ok eps_tests-test22_3+bv_orthog_block-svqb # Error code: 14
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25318] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fba203000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25328] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25328] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test22_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test25_1.counts
not ok eps_tests-test24_1 # Error code: 14
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa4ccf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25350] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25350] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test24_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test26_1.counts
not ok eps_tests-test25_1 # Error code: 14
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25378] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f95f4a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25383] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25383] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test25_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test26_1_arpack.counts
not ok eps_tests-test26_1+eps_true_residual-0_eps_two_sided-0 # Error code: 14
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8881f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25410] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25410] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test26_1 # SKIP Command failed so no diff
not ok eps_tests-test26_1_arpack # Error code: 14
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25438] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fada59000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test26_1_arpack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test27_1.counts
not ok eps_tests-test26_1+eps_true_residual-0_eps_two_sided-1 # Error code: 14
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25454] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9e915000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25457] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25457] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test26_1 # SKIP Command failed so no diff
not ok eps_tests-test27_1+eps_type-gd # Error code: 14
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25490] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9124c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test27_1 # SKIP Command failed so no diff
not ok eps_tests-test26_1+eps_true_residual-1_eps_two_sided-0 # Error code: 14
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25498] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa16b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25504] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25504] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test26_1 # SKIP Command failed so no diff
not ok eps_tests-test27_1+eps_type-jd # Error code: 14
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25526] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8dcbe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test27_1 # SKIP Command failed so no diff
not ok eps_tests-test26_1+eps_true_residual-1_eps_two_sided-1 # Error code: 14
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f896aa000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25538] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25538] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test26_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test27_2.counts
not ok eps_tests-test27_1+eps_type-rqcg # Error code: 14
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25554] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8ec66000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25580] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test27_1 # SKIP Command failed so no diff
not ok eps_tests-test27_2+st_filter_type-filtlan # Error code: 14
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25582] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb5e6c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25585] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25585] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test27_2 # SKIP Command failed so no diff
not ok eps_tests-test27_1+eps_type-lobpcg # Error code: 14
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25604] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa00cb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test27_1 # SKIP Command failed so no diff
not ok eps_tests-test27_2+st_filter_type-chebyshev # Error code: 14
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25613] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f987d3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25619] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25619] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test27_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1_jd.counts
not ok eps_tests-test28_1+eps_type-krylovschur # Error code: 14
not ok eps_tests-test28_1_jd # Error code: 14
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25671] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8ac0c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25678] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25678] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test28_1 # SKIP Command failed so no diff
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25673] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb916d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25679] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25679] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_1_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_1_lanczos.counts
not ok eps_tests-test28_1+eps_type-arnoldi # Error code: 14
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25706] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbe07c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_1 # SKIP Command failed so no diff
not ok eps_tests-test28_1_lanczos # Error code: 14
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25723] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8cac2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25726] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25726] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test28_1_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_2.counts
not ok eps_tests-test28_1+eps_type-gd # Error code: 14
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25742] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9add8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25766] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25766] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_1 # SKIP Command failed so no diff
not ok eps_tests-test28_2+eps_type-power # Error code: 14
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25770] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb67dc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25773] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25773] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test28_2 # SKIP Command failed so no diff
not ok eps_tests-test28_1+eps_type-rqcg # Error code: 14
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25789] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9c3bf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_1 # SKIP Command failed so no diff
not ok eps_tests-test28_2+eps_type-subspace # Error code: 14
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25802] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f899a7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25807] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25807] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test28_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test28_3.counts
not ok eps_tests-test28_1+eps_type-lobpcg # Error code: 14
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25823] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f82548000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_1 # SKIP Command failed so no diff
not ok eps_tests-test28_3 # Error code: 14
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25851] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbf293000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25854] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25854] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test28_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1.counts
not ok eps_tests-test28_1+eps_type-lapack # Error code: 14
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25869] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb8969000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25884] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25884] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test28_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_cmplxvals.counts
not ok eps_tests-test29_1 # Error code: 14
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25898] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f84197000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25901] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25901] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test29_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_rqi.counts
not ok eps_tests-test29_1_cmplxvals # Error code: 14
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25930] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fba58a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25945] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25945] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test29_1_cmplxvals # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_1_rqi_singular.counts
not ok eps_tests-test29_1_rqi # Error code: 14
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25958] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3faaa05000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:25961] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:25961] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test29_1_rqi # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test29_3.counts
not ok eps_tests-test29_1_rqi_singular # Error code: 14
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:25990] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa8497000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test29_1_rqi_singular # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1.counts
not ok eps_tests-test29_3 # Error code: 14
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26018] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8bc77000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26021] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26021] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test29_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_lanczos.counts
not ok eps_tests-test3_1+eps_type-krylovschur # Error code: 14
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26050] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f99b5d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26066] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26066] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1 # SKIP Command failed so no diff
not ok eps_tests-test3_1_lanczos # Error code: 14
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26078] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f89e8b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26081] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26081] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_power.counts
not ok eps_tests-test3_1+eps_type-subspace # Error code: 14
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26095] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f829b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1 # SKIP Command failed so no diff
not ok eps_tests-test3_1_power # Error code: 14
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbd5b6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26129] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26129] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1_power # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_jd.counts
not ok eps_tests-test3_1+eps_type-arnoldi # Error code: 14
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26142] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f99006000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26145] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26145] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1 # SKIP Command failed so no diff
not ok eps_tests-test3_1_jd # Error code: 14
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8c703000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26181] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26181] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_gd.counts
not ok eps_tests-test3_1+eps_type-lapack # Error code: 14
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26189] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9f5cd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26192] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26192] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_gd2.counts
not ok eps_tests-test3_1_gd # Error code: 14
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26224] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa10bc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26242] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26242] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_arpack.counts
not ok eps_tests-test3_1_gd2 # Error code: 14
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26249] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa13c9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26252] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26252] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test3_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_1_scalapack.counts not ok eps_tests-test3_1_arpack # Error code: 14 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26285] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f97a37000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test3_1_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_rqcg.counts not ok eps_tests-test3_1_scalapack # Error code: 14 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26309] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa95ca000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26312] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26312] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok eps_tests-test3_1_scalapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lobpcg.counts
not ok eps_tests-test3_2_rqcg # Error code: 14
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26346] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f957bc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_2_rqcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lanczos.counts
not ok eps_tests-test3_2_lobpcg # Error code: 14
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26369] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f839e8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26372] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26372] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_2_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test3_2_lanczos_delayed.counts
not ok eps_tests-test3_2_lanczos # Error code: 14
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9d750000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test3_2_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test30_1.counts
not ok eps_tests-test3_2_lanczos_delayed # Error code: 14
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26429] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8f661000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test3_2_lanczos_delayed # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_1.counts
not ok eps_tests-test30_1 # Error code: 14
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26467] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9dce7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26485] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26485] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test30_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_2.counts
not ok eps_tests-test31_1 # Error code: 14
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26489] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fba33e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26492] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26492] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test31_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_3.counts
not ok eps_tests-test31_2 # Error code: 14
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26531] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb9bfc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26546] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26546] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test31_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_4.counts
not ok eps_tests-test31_3 # Error code: 14
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26549] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f84472000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26552] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26552] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test31_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test31_5.counts
not ok eps_tests-test31_4+st_filter_damping-none # Error code: 14
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26592] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa1f43000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26607] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26607] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test31_4 # SKIP Command failed so no diff
not ok eps_tests-test31_5+st_filter_damping-lanczos # Error code: 14
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26609] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa5e77000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26612] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26612] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test31_5 # SKIP Command failed so no diff
not ok eps_tests-test31_4+st_filter_damping-jackson # Error code: 14
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26628] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa012c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26636] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26636] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
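Every failure above carries the same banner, which itself names a remedy: disable the gds/shmem2 component by setting PMIX_MCA_gds=hash. A minimal sketch of applying it before re-running the suite follows; the variable name and value come verbatim from the banner, while exporting it in the invoking shell (rather than, say, an mca-params.conf file) is an assumption about how this test run is launched:

```shell
# Workaround sketch for the repeated gds/shmem2 base-address failure above.
# PMIX_MCA_gds=hash is quoted directly from the PMIx error banner; setting it
# here before invoking the tests is an assumed invocation style, not taken
# from this log.
export PMIX_MCA_gds=hash   # force the hash gds component instead of shmem2
echo "PMIX_MCA_gds=$PMIX_MCA_gds"   # → PMIX_MCA_gds=hash
```

With the variable exported, PMIx clients skip the shared-memory attach whose requested base address differs from the acquired one on this riscv64 builder.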
ok eps_tests-test31_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_1.counts
not ok eps_tests-test31_5+st_filter_damping-fejer # Error code: 14
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26643] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa9de2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26646] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26646] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test31_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_2.counts
not ok eps_tests-test32_1 # Error code: 14
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9bdc1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:26700] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:26700] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test32_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_3.counts
not ok eps_tests-test32_2 # Error code: 14
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:26703] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f83b14000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26706] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26706] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test32_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_3_gnhep.counts not ok eps_tests-test32_3+nsize-1 # Error code: 14 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9e170000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26759] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26759] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test32_3 # SKIP Command failed so no diff not ok eps_tests-test32_3_gnhep+nsize-1 # Error code: 14 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26763] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f89454000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26766] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26766] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test32_3_gnhep # SKIP Command failed so no diff not ok eps_tests-test32_3+nsize-4 # Error code: 14 not ok eps_tests-test32_3_gnhep+nsize-4 # Error code: 14 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb7174000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26785] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26799] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:26800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26797] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-26785@1,3] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_3 # SKIP Command failed so no diff # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f95c63000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26806] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26794] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26803] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26805] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26803] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26806] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26805] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-26794@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_3_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_4.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_4_gnhep.counts not ok eps_tests-test32_4_gnhep+nsize-1 # Error code: 14 not ok eps_tests-test32_4+nsize-1 # Error code: 14 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26872] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9fc58000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26878] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26878] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26870] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb3ae2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26877] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26877] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test32_4 # SKIP Command failed so no diff ok eps_tests-test32_4_gnhep # SKIP Command failed so no diff not ok eps_tests-test32_4+nsize-4 # Error code: 14 not ok eps_tests-test32_4_gnhep+nsize-4 # Error code: 14 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f929e0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26914] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26915] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # [sbuild:26912] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26914] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26906] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # [sbuild:26918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26915] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-26906@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_4 # SKIP Command failed so no diff # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb483c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26913] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26916] PMIX ERROR: 
PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26905] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:26917] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:26911] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26917] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26911] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26916] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26913] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-26905@1,3] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_4_gnhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_redundant.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_mumps.counts not ok eps_tests-test32_5_redundant # Error code: 14 not ok eps_tests-test32_5_mumps # Error code: 14 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26984] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fbd180000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26993] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26993] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:26984] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:26984] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:26983] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f98751000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:26992] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:26992] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:26983] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:26983] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 ok eps_tests-test32_5_redundant # SKIP Command failed so no diff ok eps_tests-test32_5_mumps # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test32_5_superlu.counts TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test37_1.counts not ok eps_tests-test37_1 # Error code: 14 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27055] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9a351000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27061] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27061] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
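The PMIx diagnostic repeated throughout this log suggests its own workaround: disable the gds/shmem2 component by setting `PMIX_MCA_gds=hash` in the environment, and launch under a PMIx-enabled launcher such as `mpirun`. As an editorial sketch only (not part of the original log output; the test re-invocation shown in the comment is hypothetical), applying that workaround before re-running the SLEPc test suite might look like:

```shell
# Apply the workaround named in the log's own diagnostic: disable the
# gds/shmem2 shared-memory component so PMIx falls back to the hash
# data store and never tries to map a segment at a fixed base address.
export PMIX_MCA_gds=hash

# A re-invocation of a failing test would then inherit the setting,
# e.g. (hypothetical command, shown for illustration only):
#   mpirun -n 4 ./eps_tests/test32 -st_type sinvert
echo "PMIX_MCA_gds=$PMIX_MCA_gds"
```

Whether this avoids the riscv64 address-space mismatch seen above (requested `0x2aaa…` vs. acquired `0x3f…` addresses) would need to be confirmed by an actual re-run; the log itself only records the suggestion.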
ok eps_tests-test37_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test38_1.counts not ok eps_tests-test38_1 # Error code: 14 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27097] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f912e7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test38_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test39_1.counts not ok eps_tests-test32_5_superlu # Error code: 14 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fab656000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27063] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file 
../../../src/client/pmix_client.c at line 278 # [sbuild:27064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27063] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27056@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test32_5_superlu # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test39_1_lanczos.counts not ok eps_tests-test39_1_lanczos # Error code: 14 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f96c66000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27160] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27164] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27163] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27164] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27163] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27160@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test39_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_1.counts not ok eps_tests-test39_1+eps_type-krylovschur # Error code: 14 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27127] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbdbbc000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:27130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27127@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test39_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-krylovschur # Error code: 14 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27193] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb98f3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27196] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27196] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-arnoldi # Error code: 14 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27216] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fbda00000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27227] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27226] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27226] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27227] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and no-------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # t able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27216@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test39_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-subspace # Error code: 14 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27222] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa1374000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27229] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27229] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-lobpcg # Error code: 14 not ok eps_tests-test4_1+type-arnoldi # Error code: 14 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb0f4b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27253] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27266] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27265] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27265] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27266] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-27253@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27259] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3faf7f3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27264] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27264] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test39_1 # SKIP Command failed so no diff ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-lanczos # Error code: 14 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27296] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fac061000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27303] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27303] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-gd # Error code: 14 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27321] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa8d08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27324] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27324] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test39_1+eps_type-lapack # Error code: 14 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f95b3c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:27301] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:27302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27301] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-27295@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tests-test39_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_1_arpack.counts not ok eps_tests-test4_1+type-jd # Error code: 14 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 
# [sbuild:27338] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f89c20000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27347] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27347] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_1_arpack # Error code: 14 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f89c2b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_1_arpack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test4_2.counts not ok eps_tests-test4_1+type-gd2 # Error code: 14 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27383] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9b793000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27398] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27398] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test4_1 # SKIP Command failed so no diff not ok eps_tests-test4_2+type-rqcg # Error code: 14 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27413] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f88d31000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27416] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27416] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test4_2 # SKIP Command failed so no diff not ok eps_tests-test4_1+type-lapack # Error code: 14 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27430] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa22aa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27438] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27438] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test4_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test40_1.counts not ok eps_tests-test4_2+type-lobpcg # Error code: 14 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:27447] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8ca6a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:27450] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:27450] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok eps_tests-test4_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test44_1_real.counts
not ok eps_tests-test40_1 # Error code: 14
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27484] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f98813000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test40_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1.counts
ok eps_tests-test5_1 # SKIP Requires DATAFILESPATH
not ok eps_tests-test44_1_real+eps_krylovschur_bse_type-shao # Error code: 14
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27507] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa1705000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27510] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27510] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test44_1_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_power.counts
ok eps_tests-test5_1_power # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_jd.counts
ok eps_tests-test5_1_jd # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_gd.counts
ok eps_tests-test5_1_gd # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_1_gd2.counts
ok eps_tests-test5_1_gd2 # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test5_2_arpack.counts
ok eps_tests-test5_2_arpack # SKIP Requires DATAFILESPATH
not ok eps_tests-test44_1_real+eps_krylovschur_bse_type-gruning # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1.counts
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27556] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f93aa9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27570] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27570] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test44_1_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_power.counts
not ok eps_tests-test6_1+eps_type-krylovschur # Error code: 14
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27652] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f982ad000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27662] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27662] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test6_1 # SKIP Command failed so no diff
not ok eps_tests-test6_1_power # Error code: 14
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27659] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f9e8b8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27665] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27665] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test6_1_power # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_gd2.counts
not ok eps_tests-test6_1+eps_type-subspace # Error code: 14
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27684] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3faf866000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test6_1 # SKIP Command failed so no diff
not ok eps_tests-test6_1_gd2 # Error code: 14
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27709] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f914aa000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27712] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27712] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test6_1_gd2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test6_1_arpack.counts
not ok eps_tests-test6_1+eps_type-arnoldi # Error code: 14
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27727] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb7d0c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27754] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27754] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test6_1 # SKIP Command failed so no diff
not ok eps_tests-test6_1_arpack # Error code: 14
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27756] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9dd52000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27759] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27759] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test6_1_arpack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1.counts
not ok eps_tests-test6_1+eps_type-gd # Error code: 14
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27773] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbdec5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27797] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27797] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test6_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_lanczos.counts
not ok eps_tests-test8_1+eps_type-power # Error code: 14
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27803] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f943e5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27806] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27806] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test8_1 # SKIP Command failed so no diff
not ok eps_tests-test8_1_lanczos # Error code: 14
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27838] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f84fde000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27850] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27850] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test8_1_lanczos # SKIP Command failed so no diff
not ok eps_tests-test8_1+eps_type-subspace # Error code: 14
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27847] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f80bba000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27853] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27853] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tests-test8_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_lapack.counts
not ok eps_tests-test8_1+eps_type-arnoldi # Error code: 14
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27886] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f918aa000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27897] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27897] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test8_1 # SKIP Command failed so no diff
not ok eps_tests-test8_1_lapack # Error code: 14
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27894] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fac2a6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27900] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27900] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_1_lapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_krylovschur_vecs.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_jd.counts
not ok eps_tests-test8_1_krylovschur_vecs # Error code: 14
not ok eps_tests-test8_1_jd # Error code: 14
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f853ea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:27952] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb30d5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:27959] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:27959] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test8_1_krylovschur_vecs # SKIP Command failed so no diff
ok eps_tests-test8_1_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_gd.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_1_gd2.counts
not ok eps_tests-test8_1_gd2 # Error code: 14
not ok eps_tests-test8_1_gd # Error code: 14
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28013] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f88af8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28019] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28019] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_1_gd2 # SKIP Command failed so no diff
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28014] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbbaa6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28020] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28020] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_1_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2_lanczos.counts
not ok eps_tests-test8_2_lanczos # Error code: 14
not ok eps_tests-test8_2+eps_type-rqcg # Error code: 14
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28073] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb6bad000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28080] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28080] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test8_2_lanczos # SKIP Command failed so no diff
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28074] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f95544000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_2_arpack.counts
not ok eps_tests-test8_2+eps_type-lobpcg # Error code: 14
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa969e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28122] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28122] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_rqcg.counts
not ok eps_tests-test8_2_arpack # Error code: 14
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28124] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8a51b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28127] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28127] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_2_arpack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_lanczos.counts
not ok eps_tests-test8_3_rqcg # Error code: 14
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28169] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3facdc4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28183] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28183] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test8_3_rqcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test8_3_lobpcg.counts
not ok eps_tests-test8_3_lanczos # Error code: 14
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28184] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fae154000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28187] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28187] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test8_3_lanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1.counts
not ok eps_tests-test8_3_lobpcg # Error code: 14
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28229] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa09db000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28243] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28243] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tests-test8_3_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1_gd.counts
not ok eps_tests-test9_1+eps_type-krylovschur # Error code: 14
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f90a1a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28247] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28247] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test9_1 # SKIP Command failed so no diff
not ok eps_tests-test9_1_gd # Error code: 14
not ok eps_tests-test9_1+eps_type-arnoldi # Error code: 14
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28286] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb2451000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28293] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28293] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28288] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa998c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28294] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28294] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_1_gd # SKIP Command failed so no diff ok eps_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_1_gd2.counts not ok eps_tests-test9_1+eps_type-lapack # Error code: 14 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28322] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb5082000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28338] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28338] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_2.counts not ok eps_tests-test9_1_gd2 # Error code: 14 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28335] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa15d2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28341] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28341] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tests-test9_1_gd2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_3.counts not ok eps_tests-test9_2+eps_balance-none_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28387] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9a41c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28398] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28398] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-none_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28420] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbc2b5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28423] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28423] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_3+bv_orthog_refine-never # Error code: 14 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f816b8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28396] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:28401] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:28402] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28401] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28402] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-28396@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test9_3 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-oneside_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28437] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb6668000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28440] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28440] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_2+eps_balance-oneside_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb245c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_3+bv_orthog_refine-ifneeded # Error code: 14 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa1162000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28452] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:28456] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:28455] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28455] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28456] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-28452@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tests-test9_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_4.counts not ok eps_tests-test9_2+eps_balance-twoside_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28491] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory 
segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa347c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tests-test9_2 # SKIP Command failed so no diff not ok eps_tests-test9_4 # Error code: 14 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28520] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f9378f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28527] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28527] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_5.counts not ok eps_tests-test9_2+eps_balance-twoside_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28536] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9c139000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28539] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28539] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok eps_tests-test9_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_5_arpack.counts not ok eps_tests-test9_5 # Error code: 14 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28568] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f96213000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:28589] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:28589] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tests-test9_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6.counts not ok eps_tests-test9_5_arpack # Error code: 14 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:28596] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8876a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test9_5_arpack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_hankel.counts
not ok eps_tests-test9_6 # Error code: 14
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28628] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9e94b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28649] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28649] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_cheby.counts
not ok eps_tests-test9_6_hankel # Error code: 14
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28656] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa6501000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28659] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28659] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_6_hankel # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_hankel_cheby.counts
not ok eps_tests-test9_6_cheby # Error code: 14
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28688] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb4a60000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28708] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28708] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_6_cheby # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_refine.counts
not ok eps_tests-test9_6_hankel_cheby # Error code: 14
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28716] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8c323000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28719] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28719] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test9_6_hankel_cheby # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_bcgs.counts
not ok eps_tests-test9_6_refine # Error code: 14
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28749] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3facd71000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28769] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28769] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_6_refine # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_6_cheby_interval.counts
not ok eps_tests-test9_6_bcgs # Error code: 14
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28776] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa1765000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28779] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28779] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test9_6_bcgs # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_7_real.counts
not ok eps_tests-test9_6_cheby_interval # Error code: 14
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28808] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fba5bf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28829] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28829] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_6_cheby_interval # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tests-test9_8.counts
not ok eps_tests-test9_7_real # Error code: 14
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28836] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9a0ab000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28839] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28839] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tests-test9_7_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_sinvert.counts
not ok eps_tests-test9_8 # Error code: 14
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28868] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9f6c2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28888] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28888] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tests-test9_8 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_sinvert_twoside.counts
not ok eps_tutorials-ex10_1_sinvert # Error code: 14
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28896] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f903b4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28899] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28899] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex10_1_sinvert # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_shell.counts
not ok eps_tutorials-ex10_1_sinvert_twoside+set_ht-0 # Error code: 14
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f84e53000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28948] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28948] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex10_1_sinvert_twoside # SKIP Command failed so no diff
not ok eps_tutorials-ex10_1_shell # Error code: 14
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28956] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa618d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28959] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28959] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex10_1_shell # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex10_1_shell_twoside.counts
not ok eps_tutorials-ex10_1_sinvert_twoside+set_ht-1 # Error code: 14
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:28973] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb9e43000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:28978] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:28978] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex10_1_sinvert_twoside # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex11_1.counts
not ok eps_tutorials-ex10_1_shell_twoside+set_ht-0 # Error code: 14
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29003] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f85d32000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29006] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29006] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex10_1_shell_twoside # SKIP Command failed so no diff not ok eps_tutorials-ex11_1 # Error code: 14 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29033] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8b103000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:29038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:29038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex11_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex11_2.counts not ok eps_tutorials-ex10_1_shell_twoside+set_ht-1 # Error code: 14 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:29050] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb440c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29053] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29053] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tutorials-ex10_1_shell_twoside # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex12_1.counts
not ok eps_tutorials-ex11_2 # Error code: 14
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29082] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbcb5e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29100] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29100] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tutorials-ex11_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_1.counts
not ok eps_tutorials-ex12_1 # Error code: 14
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8c57e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29113] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29113] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_2.counts
not ok eps_tutorials-ex13_1 # Error code: 14
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29142] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f89938000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29161] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29161] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex13_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex13_2+st_matstructure-different # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_3.counts
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29170] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f909b2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex13_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex13_2+st_matstructure-subset # Error code: 14
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29208] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fac688000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29217] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29217] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex13_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex13_3 # Error code: 14
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29214] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fab79e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29220] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29220] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex13_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_4.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex13_6.counts
not ok eps_tutorials-ex13_4+eps_type-gd # Error code: 14
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29269] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f94b96000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29277] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29277] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex13_4 # SKIP Command failed so no diff
not ok eps_tutorials-ex13_4+eps_type-lobpcg # Error code: 14
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29299] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb0348000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tutorials-ex13_4 # SKIP Command failed so no diff
not ok eps_tutorials-ex13_6 # Error code: 14
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9004c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29274] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:29281] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:29280] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29280] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29281] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-29274@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok eps_tutorials-ex13_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex18_1.counts
not ok eps_tutorials-ex13_4+eps_type-rqcg # Error code: 14
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29316] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f95d89000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29323] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29323] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok eps_tutorials-ex13_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_1_krylovschur.counts
not ok eps_tutorials-ex18_1 # Error code: 14
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29344] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8b1de000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29347] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29347] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex18_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_1_lobpcg.counts
not ok eps_tutorials-ex19_1_krylovschur # Error code: 14
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29376] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fad352000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex19_1_krylovschur # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex19_2.counts
not ok eps_tutorials-ex19_1_lobpcg # Error code: 14
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29404] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbf114000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29407] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29407] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex19_1_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_1.counts
not ok eps_tutorials-ex19_2 # Error code: 14
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29436] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb4281000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29445] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29445] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex19_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_2.counts
not ok eps_tutorials-ex2_1 # Error code: 14
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29464] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb0b79000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29467] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29467] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_ciss_2.counts
not ok eps_tutorials-ex2_2 # Error code: 14
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29495] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa96e7000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_3.counts
not ok eps_tutorials-ex2_ciss_2 # Error code: 14
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f98610000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29527] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29524] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:29528] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29527] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29528] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
# 
# Process name: [prterun-sbuild-29524@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_ciss_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_4.counts
not ok eps_tutorials-ex2_3 # Error code: 14
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f97aba000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29566] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29566] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex2_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_4_filter.counts
not ok eps_tutorials-ex2_4 # Error code: 14
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29587] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f996fd000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29590] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29590] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex2_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_5.counts
not ok eps_tutorials-ex2_4_filter+eps_type-krylovschur_st_filter_type-filtlan # Error code: 14
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa0d32000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29625] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29625] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff
not ok eps_tutorials-ex2_5 # Error code: 14
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa178a000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29650] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29650] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6.counts
not ok eps_tutorials-ex2_4_filter+eps_type-krylovschur_st_filter_type-chebyshev # Error code: 14
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa6807000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff
not ok eps_tutorials-ex2_6 # Error code: 14
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29694] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa057d000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29699] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29699] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok eps_tutorials-ex2_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6_rel_large.counts
not ok eps_tutorials-ex2_4_filter+eps_type-subspace_st_filter_type-filtlan # Error code: 14
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29711] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb92bb000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29714] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29714] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff
not ok eps_tutorials-ex2_6_rel_large # Error code: 14
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29743] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# 
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9e479000
# 
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29757] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29757] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
# 
# PMIX_ERROR
# 
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex2_6_rel_large # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex2_6_rel_small.counts
not ok eps_tutorials-ex2_4_filter+eps_type-subspace_st_filter_type-chebyshev # Error code: 14
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29758] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa9f8c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29761] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29761] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex2_4_filter # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1.counts
not ok eps_tutorials-ex2_6_rel_small # Error code: 14
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29807] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb525c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29818] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29818] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex2_6_rel_small # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1_lobpcg.counts
not ok eps_tutorials-ex24_1 # Error code: 14
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29816] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9793a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29821] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29821] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex24_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex24_1_gd.counts
not ok eps_tutorials-ex24_1_lobpcg # Error code: 14
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29867] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8196e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29878] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29878] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex24_1_lobpcg # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex25_1_mumps.counts
not ok eps_tutorials-ex24_1_gd # Error code: 14
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29875] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f85be4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29881] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29881] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex24_1_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex29_1.counts
not ok eps_tutorials-ex25_1_mumps # Error code: 14
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29928] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f9f3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29938] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29938] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex25_1_mumps # SKIP Command failed so no diff
not ok eps_tutorials-ex29_1 # Error code: 14
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29935] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb73fe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29941] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29941] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex3_1.counts
ok eps_tutorials-ex29_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex3_2.counts
not ok eps_tutorials-ex3_1 # Error code: 14
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29989] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9b85c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:29998] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:29998] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex3_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex3_2 # Error code: 14
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:29995] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa2a15000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30001] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30001] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex3_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex30_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex31_1.counts
not ok eps_tutorials-ex30_1 # Error code: 14
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30053] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa0798000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30064] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30064] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex30_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex31_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_1.counts
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30061] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f96040000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30067] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30067] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex31_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_2.counts
not ok eps_tutorials-ex34_1 # Error code: 14
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30114] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb10cd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30124] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30124] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_2+form_function_ab-0 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_3.counts
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30121] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f93549000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30127] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30127] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_2+form_function_ab-1 # Error code: 14
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30161] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa4a94000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30171] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30171] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_3 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_4.counts
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30168] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f98839000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30174] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30174] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex34_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_5.counts
not ok eps_tutorials-ex34_4+form_function_ab-0 # Error code: 14
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30221] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa68e6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30231] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30231] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex34_4 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_5+form_function_ab-0 # Error code: 14
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30228] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8e6c9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex34_5 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_4+form_function_ab-1 # Error code: 14
not ok eps_tutorials-ex34_5+form_function_ab-1 # Error code: 14
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30254] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9f4b5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30265] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30265] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_4 # SKIP Command failed so no diff
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f7f7b2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30268] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30268] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_6.counts
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_7.counts
not ok eps_tutorials-ex34_6+form_function_ab-0 # Error code: 14
not ok eps_tutorials-ex34_7 # Error code: 14
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30322] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f937d8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30328] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30328] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30320] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa9f77000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30327] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30327] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_7 # SKIP Command failed so no diff
ok eps_tutorials-ex34_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_8.counts
not ok eps_tutorials-ex34_6+form_function_ab-1 # Error code: 14
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30355] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa1b4d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_6 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_8+form_function_ab-0 # Error code: 14
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30372] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa3979000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30375] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:30375] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex34_8 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_9.counts
not ok eps_tutorials-ex34_8+form_function_ab-1 # Error code: 14
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30403] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbb540000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30409] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30409] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_8 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex34_10.counts
not ok eps_tutorials-ex34_9+use_custom_norm-0 # Error code: 14
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f80e4d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex34_9 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_10+use_custom_norm-0_form_function_ab-0 # Error code: 14
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30451] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb2b6f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30466] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_10 # SKIP Command failed so no diff
not ok eps_tutorials-ex34_9+use_custom_norm-1 # Error code: 14
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30464] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb05b3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30469] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30469] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_9 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex35_1.counts
not ok eps_tutorials-ex34_10+use_custom_norm-0_form_function_ab-1 # Error code: 14
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f99610000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30505] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30505] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex34_10 # SKIP Command failed so no diff
not ok eps_tutorials-ex35_1 # Error code: 14
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30513] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f81f7c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30516] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30516] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex35_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_1.counts not ok eps_tutorials-ex34_10+use_custom_norm-1_form_function_ab-0 # Error code: 14 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30530] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb86dd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:30548] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:30548] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
ok eps_tutorials-ex34_10 # SKIP Command failed so no diff
not ok eps_tutorials-ex36_1 # Error code: 14
ok eps_tutorials-ex36_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_2.counts
not ok eps_tutorials-ex34_10+use_custom_norm-1_form_function_ab-1 # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex34_10 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex36_3.counts
not ok eps_tutorials-ex36_2+eps_power_shift_type-constant_eps_two_sided-0 # Error code: 14
ok eps_tutorials-ex36_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex36_3 # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex36_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex4_1.counts
not ok eps_tutorials-ex36_2+eps_power_shift_type-constant_eps_two_sided-1 # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex36_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex4_1 # Error code: 14
not ok eps_tutorials-ex36_2+eps_power_shift_type-rayleigh_eps_two_sided-0 # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex4_1 # SKIP Command failed so no diff
ok eps_tutorials-ex36_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex41_1.counts
not ok eps_tutorials-ex36_2+eps_power_shift_type-rayleigh_eps_two_sided-1 # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex36_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex41_1+eps_type-power # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex41_1_balance.counts
# --------------------------------------------------------------------------
ok eps_tutorials-ex41_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex41_1+eps_type-krylovschur # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex41_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex41_1_balance+eps_balance-oneside # Error code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex41_1_balance # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex43_1.counts
not ok eps_tutorials-ex41_1_balance+eps_balance-twoside # Error code: 14
# -------------------------------------------------------------------------- # ok eps_tutorials-ex41_1_balance # SKIP Command failed so no diff not ok eps_tutorials-ex43_1 # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex43_2.counts # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:30839] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fab6fe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30845] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30845] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex43_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_1.counts
not ok eps_tutorials-ex44_1+eps_type-krylovschur # Error code: 14
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30899] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3faa850000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30906] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30906] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex44_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex44_1+eps_type-lyapii # Error code: 14
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30924] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8afd3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30927] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30927] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex44_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_2.counts
not ok eps_tutorials-ex43_2 # Error code: 14
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f8a572000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30903] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30903] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30892] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:30902] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated.
# The first process to do so was:
#
# Process name: [prterun-sbuild-30892@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex43_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex44_3.counts
not ok eps_tutorials-ex44_2+eps_type-krylovschur # Error code: 14
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f843b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30969] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30969] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex44_2 # SKIP Command failed so no diff
not ok eps_tutorials-ex44_3+eps_type-krylovschur # Error code: 14
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30982] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb76c5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:30985] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:30985] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex44_3 # SKIP Command failed so no diff
not ok eps_tutorials-ex44_2+eps_type-lyapii # Error code: 14
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:30999] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb1aea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31004] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31004] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex44_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex46_1.counts
not ok eps_tutorials-ex44_3+eps_type-lyapii # Error code: 14
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31016] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3faaee0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31019] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31019] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex44_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex46_2.counts
not ok eps_tutorials-ex46_1 # Error code: 14
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31048] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f95d66000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex46_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex47_1.counts
not ok eps_tutorials-ex46_2 # Error code: 14
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31076] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9ff6f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31079] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31079] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex46_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1.counts
not ok eps_tutorials-ex47_1 # Error code: 14
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31108] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbe37b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex47_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1_jd.counts
not ok eps_tutorials-ex49_1 # Error code: 14
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31136] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f89e99000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31139] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31139] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex49_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_1_lobpcg.counts
not ok eps_tutorials-ex49_1_jd # Error code: 14
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31168] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f87de7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31184] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31184] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex49_1_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2.counts
not ok eps_tutorials-ex49_1_lobpcg # Error code: 14
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31196] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9b222000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31199] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31199] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok eps_tutorials-ex49_1_lobpcg # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2_nost.counts not ok eps_tutorials-ex49_2 # Error code: 14 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31228] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f81204000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31239] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31239] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex49_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex49_2_par.counts not ok eps_tutorials-ex49_2_nost # Error code: 14 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9fd77000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31259] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31259] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex49_2_nost # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex5_1.counts not ok eps_tutorials-ex49_2_par # Error code: 14 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31288] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f83cf1000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31303] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31303] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-31288@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex49_2_par # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex5_2.counts not ok eps_tutorials-ex5_2 # Error code: 14 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31354] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f93b39000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31357] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31357] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex5_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real.counts not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nsize-1 # Error code: 14 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31384] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f96cf3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31387] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31387] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-0_eps_krylovschur_locking-0 # Error code: 14 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbcdab000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31317] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31320] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31321] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31321] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31320] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31317@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex5_1 # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-shao_nsize-2 # Error code: 14 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c 
at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa6419000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31401] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31405] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31404] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31405] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31404] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31401@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nsize-1 # Error code: 14 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31441] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f88817000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31444] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31444] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff not ok eps_tutorials-ex5_1+eps_two_sided-0_eps_krylovschur_locking-1 # Error code: 14 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8a12d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31417] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31424] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31424] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:31425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31417@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok eps_tutorials-ex5_1 # SKIP Command failed so no diff not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-gruning_nsize-2 # Error code: 14 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb5980000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31458] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31462] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31461] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31461] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31458@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# [sbuild:31458] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../src/server/pmix_server.c at line 3171
#
ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nsize-1 # Error code: 14
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31498] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f96567000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real+eps_krylovschur_bse_type-projectedbse_nsize-2 # Error code: 14
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9e470000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31515] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31519] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31518] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31518] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31519] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31515@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex55_1_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real_sinvert.counts
not ok eps_tutorials-ex5_1+eps_two_sided-1_eps_krylovschur_locking-0 # Error code: 14
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31478] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb7ff5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31482] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31481] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:31481] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31478@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex5_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-shao # Error code: 14
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31560] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f83129000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31567] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31567] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-gruning # Error code: 14
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31585] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8d21c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31588] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31588] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff
not ok eps_tutorials-ex5_1+eps_two_sided-1_eps_krylovschur_locking-1 # Error code: 14
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f91b4f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31558] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31565] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31566] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31565] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31558@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex5_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_1_real_sinvert_scalapack.counts
not ok eps_tutorials-ex55_1_real_sinvert+eps_krylovschur_bse_type-projectedbse # Error code: 14
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31602] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f90c09000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31607] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31607] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex55_1_real_sinvert # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_2_real.counts
not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-shao # Error code: 14
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f82e14000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31634] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31633] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31634] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31633] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31630@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff
not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-shao # Error code: 14
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31675] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa1c79000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31684] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31684] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-gruning # Error code: 14
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbddeb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31686] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31679] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31687] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31687] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31686] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31679@1,2]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff
not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-gruning # Error code: 14
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31712] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f96db7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff
not ok eps_tutorials-ex55_1_real_sinvert_scalapack+eps_krylovschur_bse_type-projectedbse # Error code: 14
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbdf65000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31724] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31726] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31724] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31726] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31718@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex55_1_real_sinvert_scalapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex55_3.counts
not ok eps_tutorials-ex55_2_real+eps_ncv-10_eps_krylovschur_bse_type-projectedbse # Error code: 14
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31745] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one.
Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb4c3f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31765] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31765] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_3 # Error code: 14 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31773] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9b667000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31776] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31776] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex55_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex56_1.counts not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-shao # Error code: 14 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31790] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f90512000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex56_1+nsize-1 # Error code: 14 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31820] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa65e2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31823] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31823] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1 # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-gruning # Error code: 14 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31837] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbddf5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff not ok eps_tutorials-ex55_2_real+eps_ncv-24_eps_krylovschur_bse_type-projectedbse # Error code: 14 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31876] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb3deb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31879] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31879] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex55_2_real # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex56_1_nhep.counts not ok eps_tutorials-ex56_1+nsize-2 # Error code: 14 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb76ad000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31854] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31857] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31858] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31858] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31857] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31854@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_1.counts not ok eps_tutorials-ex56_1_nhep+nsize-1 # Error code: 14 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at 
line 2476 # [sbuild:31906] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f86b08000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31909] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31909] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1_nhep # SKIP Command failed so no diff not ok eps_tutorials-ex57_1+nsize-1 # Error code: 14 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31934] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f85baf000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31937] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31937] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok eps_tutorials-ex57_1 # SKIP Command failed so no diff not ok eps_tutorials-ex56_1_nhep+nsize-2 # Error code: 14 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:31951] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f942d5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:31955] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:31954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31955] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:31954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-31951@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok eps_tutorials-ex56_1_nhep # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_1_nhep.counts not ok eps_tutorials-ex57_1_nhep+nsize-1 # Error code: 14 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32002] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb6155000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32009] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32009] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex57_1_nhep # SKIP Command failed so no diff
not ok eps_tutorials-ex57_1+nsize-2 # Error code: 14
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f971a5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:31969] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:31977] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:31976] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31976] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:31977] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-31969@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex57_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex57_2.counts
not ok eps_tutorials-ex57_2 # Error code: 14
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32056] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8ef52000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32059] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32059] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex57_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex7_1.counts
not ok eps_tutorials-ex57_1_nhep+nsize-2 # Error code: 14
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f99432000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32023] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:32027] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:32026] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32027] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32026] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-32023@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok eps_tutorials-ex57_1_nhep # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex7_3.counts
not ok eps_tutorials-ex7_1 # Error code: 14
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32086] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f83083000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32089] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32089] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex7_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_1.counts
not ok eps_tutorials-ex7_3 # Error code: 14
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32114] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbc540000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32117] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32117] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex7_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_2.counts
not ok eps_tutorials-ex9_1+eps_two_sided-0_eps_type-krylovschur # Error code: 14
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32145] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f91919000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32149] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32149] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex9_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex9_2 # Error code: 14
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb2287000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32177] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32177] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex9_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_3.counts
not ok eps_tutorials-ex9_1+eps_two_sided-0_eps_type-lapack # Error code: 14
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9146f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32195] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32195] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex9_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex9_3 # Error code: 14
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32221] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa69b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32226] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32226] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex9_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_4.counts
not ok eps_tutorials-ex9_1+eps_two_sided-1_eps_type-krylovschur # Error code: 14
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32238] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f87cfc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32241] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32241] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex9_1 # SKIP Command failed so no diff
not ok eps_tutorials-ex9_4 # Error code: 14
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32270] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f87ac6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32280] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32280] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex9_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_5.counts
not ok eps_tutorials-ex9_1+eps_two_sided-1_eps_type-lapack # Error code: 14
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32285] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa3212000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32288] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32288] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex9_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_7.counts
not ok eps_tutorials-ex9_5 # Error code: 14
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32330] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb4e96000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32345] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32345] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok eps_tutorials-ex9_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_8.counts
not ok eps_tutorials-ex9_7 # Error code: 14
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32344] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8c5d8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32348] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32348] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok eps_tutorials-ex9_7 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/eps_tutorials-ex9_9.counts
not ok eps_tutorials-ex9_8 # Error code: 14
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32388] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f844cd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32403] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32403] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok eps_tutorials-ex9_8 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1.counts not ok eps_tutorials-ex9_9 # Error code: 14 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32405] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa93da000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32408] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32408] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok eps_tutorials-ex9_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1_cross_gd.counts not ok svd_tests-test1_1+type-lanczos # Error code: 14 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32449] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9b284000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32464] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32464] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test1_1_cross_gd # Error code: 14 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32465] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f80af7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32468] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32468] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test1_1_cross_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test1_1_cyclic_gd.counts not ok svd_tests-test1_1+type-trlanczos # Error code: 14 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32484] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa7bf7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32495] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32495] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test1_1_cyclic_gd # Error code: 14 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32512] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f89f7a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32515] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32515] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test1_1_cyclic_gd # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_1.counts not ok svd_tests-test1_1+type-cross # Error code: 14 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f90be6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test10_1 # Error code: 14 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32559] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f99f68000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32563] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32563] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test10_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_2.counts not ok svd_tests-test1_1+type-cyclic # Error code: 14 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9a626000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff not ok svd_tests-test10_2 # Error code: 14 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32608] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f97f47000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32614] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32614] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test10_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test10_3.counts not ok svd_tests-test1_1+type-lapack # Error code: 14 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32623] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fade82000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32626] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32626] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test1_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test11_1.counts not ok svd_tests-test10_3 # Error code: 14 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32660] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa5a29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32676] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32676] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test10_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test12_1.counts not ok svd_tests-test11_1 # Error code: 14 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:32683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f951f9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:32686] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:32686] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok svd_tests-test11_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1.counts
not ok svd_tests-test12_1 # Error code: 14
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32718] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f916a9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32737] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32737] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1_cross.counts
not ok svd_tests-test14_1+svd_type-lanczos # Error code: 14
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32743] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9a9b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32746] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32746] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_1 # SKIP Command failed so no diff
not ok svd_tests-test14_1_cross+svd_cross_explicitmatrix-0 # Error code: 14
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32780] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbcf4e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32790] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32790] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
not ok svd_tests-test14_1+svd_type-trlanczos # Error code: 14
ok svd_tests-test14_1_cross # SKIP Command failed so no diff
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32787] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9950c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32793] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32793] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test14_1 # SKIP Command failed so no diff
not ok svd_tests-test14_1_cross+svd_cross_explicitmatrix-1 # Error code: 14
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32818] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa7905000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32824] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32824] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_1_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_1_cyclic.counts
not ok svd_tests-test14_1+svd_type-lapack # Error code: 14
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32821] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8897a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32827] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32827] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test14_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2.counts
not ok svd_tests-test14_1_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32874] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f95c35000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32884] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32884] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_1_cyclic # SKIP Command failed so no diff
not ok svd_tests-test14_2+svd_type-lanczos # Error code: 14
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32881] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa20bb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32887] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32887] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_2 # SKIP Command failed so no diff
not ok svd_tests-test14_1_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbb49f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32918] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_1_cyclic # SKIP Command failed so no diff
not ok svd_tests-test14_2+svd_type-trlanczos # Error code: 14
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32915] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8604d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32921] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32921] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2_cross.counts
ok svd_tests-test14_2 # SKIP Command failed so no diff
not ok svd_tests-test14_2+svd_type-lapack # Error code: 14
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32955] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb9168000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32965] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32965] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test14_2_cyclic.counts
not ok svd_tests-test14_2_cross+svd_cross_explicitmatrix-0 # Error code: 14
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:32962] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f92674000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:32968] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:32968] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test14_2_cross # SKIP Command failed so no diff
not ok svd_tests-test14_2_cross+svd_cross_explicitmatrix-1 # Error code: 14
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33004] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f92dc0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33012] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33012] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test14_2_cross # SKIP Command failed so no diff
not ok svd_tests-test14_2_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33009] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa864f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33015] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33015] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok svd_tests-test14_2_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_1.counts not ok svd_tests-test14_2_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33048] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fa8320000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33059] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33059] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok svd_tests-test14_2_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_2.counts
not ok svd_tests-test15_1+svd_trlanczos_gbidiag-single # Error code: 14
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33057] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9d11d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33062] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33062] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test15_1 # SKIP Command failed so no diff
not ok svd_tests-test15_1+svd_trlanczos_gbidiag-upper # Error code: 14
not ok svd_tests-test15_2 # Error code: 14
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33103] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8f010000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33109] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33109] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test15_1 # SKIP Command failed so no diff
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33101] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f94e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33108] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33108] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
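[Editorial note: the PMIx error text above repeatedly suggests a workaround, namely disabling the gds/shmem2 component via `PMIX_MCA_gds=hash`. A minimal sketch of applying it before re-running the failing tests; the `make check` target is a hypothetical placeholder for however the test suite is invoked in this build.]

```shell
# Force PMIx to use the hash gds component instead of gds/shmem2,
# as suggested by the error message in the log above.
export PMIX_MCA_gds=hash
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"

# Hypothetical re-run of the test suite with the workaround in effect:
# make check
```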
ok svd_tests-test15_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_3.counts
not ok svd_tests-test15_1+svd_trlanczos_gbidiag-lower # Error code: 14
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33136] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f97a21000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33153] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33153] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test15_1 # SKIP Command failed so no diff
not ok svd_tests-test15_3 # Error code: 14
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33150] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb7969000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33156] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33156] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test15_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test15_4.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_lapack.counts
not ok svd_tests-test15_4 # Error code: 14
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33205] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f92e5f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33213] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33213] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test15_4 # SKIP Command failed so no diff
not ok svd_tests-test16_1_lapack # Error code: 14
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33210] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8f97c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33216] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33216] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test16_1_lapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_cross.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_cyclic.counts
not ok svd_tests-test16_1_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
not ok svd_tests-test16_1_cross+svd_cross_explicitmatrix-0 # Error code: 14
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33270] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3faf50c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33276] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33276] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33267] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3faf3ba000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33274] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33274] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test16_1_cyclic # SKIP Command failed so no diff
ok svd_tests-test16_1_cross # SKIP Command failed so no diff
not ok svd_tests-test16_1_cross+svd_cross_explicitmatrix-1 # Error code: 14
not ok svd_tests-test16_1_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33304] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f915d7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test16_1_cross # SKIP Command failed so no diff
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33303] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa1551000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test16_1_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_trlanczos.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test16_1_trlanczos_par.counts
not ok svd_tests-test16_1_trlanczos+svd_trlanczos_gbidiag-single # Error code: 14
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33362] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fae645000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test16_1_trlanczos # SKIP Command failed so no diff
not ok svd_tests-test16_1_trlanczos+svd_trlanczos_gbidiag-lower # Error code: 14
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33389] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb527f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33392] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33392] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test16_1_trlanczos # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_1.counts
not ok svd_tests-test16_1_trlanczos_par+ds_parallel-redundant # Error code: 14
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb6053000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:33371] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-33364@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok svd_tests-test16_1_trlanczos_par # SKIP Command failed so no diff
not ok svd_tests-test18_1+svd_type-lapack # Error code: 14
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbb469000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33432] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33432] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff not ok svd_tests-test18_1+svd_type-cross # Error code: 14 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33456] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f89181000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33459] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33459] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff not ok svd_tests-test16_1_trlanczos_par+ds_parallel-synchronized # Error code: 14 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fbb713000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:33438] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:33437] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33438] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33437] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-33434@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok svd_tests-test16_1_trlanczos_par # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_1_trlanczos.counts not ok svd_tests-test18_1+svd_type-cyclic # Error code: 14 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file 
../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33473] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f83082000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33476] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33476] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test18_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test18_2.counts not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-single # Error code: 14 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33501] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faab64000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33506] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33506] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test18_2 # Error code: 14 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33531] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3facd4f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test18_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test19_1.counts not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-upper # Error code: 14 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33548] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f99f0e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33551] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33551] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff not ok svd_tests-test19_1 # Error code: 14 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33580] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fabeb0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33587] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33587] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test19_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test2_1.counts ok svd_tests-test2_1 # SKIP Requires DATAFILESPATH TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test20_1.counts not ok svd_tests-test18_1_trlanczos+svd_trlanczos_gbidiag-lower # Error code: 14 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33595] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fbd099000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33598] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33598] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test18_1_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lanczos.counts not ok svd_tests-test20_1 # Error code: 14 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33653] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fbbd9a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test20_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lanczos_one.counts not ok svd_tests-test3_1_lanczos # Error code: 14 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33668] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f92aa5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33671] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33671] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos.counts not ok svd_tests-test3_1_lanczos_one # Error code: 14 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f816a3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:33728] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:33728] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_1_lanczos_one # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one.counts not ok svd_tests-test3_1_trlanczos+svd_trlanczos_locking-0 # Error code: 14 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:33726] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa89f6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33731] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33731] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test3_1_trlanczos # SKIP Command failed so no diff
not ok svd_tests-test3_1_trlanczos+svd_trlanczos_locking-1 # Error code: 14
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33768] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa2946000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_1_trlanczos # SKIP Command failed so no diff
not ok svd_tests-test3_1_trlanczos_one # Error code: 14
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33772] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f948a9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33778] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33778] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_1_trlanczos_one # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one_mgs.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_trlanczos_one_always.counts
not ok svd_tests-test3_1_trlanczos_one_mgs # Error code: 14
not ok svd_tests-test3_1_trlanczos_one_always # Error code: 14
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33829] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f948f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33836] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33836] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33832] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa3f52000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33838] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33838] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_1_trlanczos_one_mgs # SKIP Command failed so no diff
ok svd_tests-test3_1_trlanczos_one_always # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cross.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cross_exp.counts
not ok svd_tests-test3_1_cross # Error code: 14
not ok svd_tests-test3_1_cross_exp # Error code: 14
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33892] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb50ed000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33898] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33898] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_1_cross_exp # SKIP Command failed so no diff
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33890] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbdc13000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33896] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33896] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test3_1_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cyclic.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_cyclic_exp.counts
not ok svd_tests-test3_1_cyclic_exp # Error code: 14
not ok svd_tests-test3_1_cyclic # Error code: 14
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33952] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f85503000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33957] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33957] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:33951] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f818ff000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:33958] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:33958] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test3_1_cyclic_exp # SKIP Command failed so no diff
ok svd_tests-test3_1_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_lapack.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_1_randomized.counts
not ok svd_tests-test3_1_lapack # Error code: 14
not ok svd_tests-test3_1_randomized # Error code: 14
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fbec9b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34017] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34017] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34012] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb5ba1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34018] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34018] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test3_1_lapack # SKIP Command failed so no diff
ok svd_tests-test3_1_randomized # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lanczos.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lanczos_one.counts
not ok svd_tests-test3_2_lanczos # Error code: 14
not ok svd_tests-test3_2_lanczos_one # Error code: 14
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34071] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8538c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test3_2_lanczos # SKIP Command failed so no diff
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34072] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8bd0f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34078] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34078] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_2_lanczos_one # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one.counts
not ok svd_tests-test3_2_trlanczos # Error code: 14
not ok svd_tests-test3_2_trlanczos_one # Error code: 14
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34131] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8d5d9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34137] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34137] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test3_2_trlanczos_one # SKIP Command failed so no diff
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34132] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbcef7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34138] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34138] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tests-test3_2_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one_mgs.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_trlanczos_one_always.counts not ok svd_tests-test3_2_trlanczos_one_mgs # Error code: 14 not ok svd_tests-test3_2_trlanczos_one_always # Error code: 14 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa83ae000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34197] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34197] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_2_trlanczos_one_mgs # SKIP Command failed so no diff # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34192] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb5740000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34198] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34198] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test3_2_trlanczos_one_always # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cross.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cross_exp.counts not ok svd_tests-test3_2_cross_exp # Error code: 14 not ok svd_tests-test3_2_cross # Error code: 14 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34252] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa9132000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34258] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34258] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test3_2_cross_exp # SKIP Command failed so no diff # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34250] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3faa49b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34257] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34257] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test3_2_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_cyclic.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_lapack.counts not ok svd_tests-test3_2_lapack # Error code: 14 not ok svd_tests-test3_2_cyclic # Error code: 14 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34311] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f944c3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34318] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34318] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34312] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb4887000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test3_2_cyclic # SKIP Command failed so no diff ok svd_tests-test3_2_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_2_randomized.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_4.counts not ok svd_tests-test3_2_randomized # Error code: 14 not ok svd_tests-test3_4 # Error code: 14 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34371] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component 
attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb3ec3000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34378] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:34379] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34378] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34379] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-34371@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34372] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8240d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34377] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34377] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test3_2_randomized # SKIP Command failed so no diff ok svd_tests-test3_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test3_5.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_lanczos.counts not ok svd_tests-test3_5 # Error code: 14 not ok svd_tests-test4_1_lanczos # Error code: 14 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34435] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f94e62000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34441] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34441] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test3_5 # SKIP Command failed so no diff # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34434] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8e46b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34440] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34440] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tests-test4_1_lanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_randomized.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_trlanczos.counts not ok svd_tests-test4_1_randomized # Error code: 14 not ok svd_tests-test4_1_trlanczos # Error code: 14 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34495] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f91d5c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34500] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34500] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tests-test4_1_trlanczos # SKIP Command failed so no diff # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34494] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb9b1a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34501] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34501] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tests-test4_1_randomized # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross.counts TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross_exp.counts not ok svd_tests-test4_1_cross # Error code: 14 not ok svd_tests-test4_1_cross_exp # Error code: 14 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34555] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb8f29000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34560] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:34560] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:34554] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb4615000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:34561] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34561] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test4_1_cross # SKIP Command failed so no diff
ok svd_tests-test4_1_cross_exp # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cross_exp_imp.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic.counts
not ok svd_tests-test4_1_cyclic # Error code: 14
not ok svd_tests-test4_1_cross_exp_imp # Error code: 14
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34615] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fab0af000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34621] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34621] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34614] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9668c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34620] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34620] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test4_1_cross_exp_imp # SKIP Command failed so no diff
ok svd_tests-test4_1_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic_imp.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_cyclic_exp.counts
not ok svd_tests-test4_1_cyclic_exp # Error code: 14
not ok svd_tests-test4_1_cyclic_imp # Error code: 14
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34674] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8faf9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34681] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34681] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34675] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbb54b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34680] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34680] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test4_1_cyclic_imp # SKIP Command failed so no diff
ok svd_tests-test4_1_cyclic_exp # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_lapack.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_1_scalapack.counts
not ok svd_tests-test4_1_scalapack # Error code: 14
not ok svd_tests-test4_1_lapack # Error code: 14
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34734] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa1e5d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34740] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34740] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34735] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb0af5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34741] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34741] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test4_1_scalapack # SKIP Command failed so no diff
ok svd_tests-test4_1_lapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test4_3.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_1.counts
not ok svd_tests-test5_1 # Error code: 14
not ok svd_tests-test4_3 # Error code: 14
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34794] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3faea63000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34800] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34800] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test5_1 # SKIP Command failed so no diff
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f973b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34795] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:34802] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:34801] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34802] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** on a NULL communicator
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34801] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-34795@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok svd_tests-test4_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_3.counts
not ok svd_tests-test5_3 # Error code: 14
not ok svd_tests-test5_2 # Error code: 14
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34858] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa2eb0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34863] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34863] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test5_3 # SKIP Command failed so no diff
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8eb12000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34864] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34864] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test5_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test5_4.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test6_1_subspace.counts
not ok svd_tests-test5_4 # Error code: 14
not ok svd_tests-test6_1_subspace # Error code: 14
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34918] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb9f36000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34924] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34924] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test5_4 # SKIP Command failed so no diff
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fab366000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test6_1_subspace # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test6_1_lobpcg.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test7_1.counts
not ok svd_tests-test7_1 # Error code: 14
not ok svd_tests-test6_1_lobpcg # Error code: 14
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34976] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fab287000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34984] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34984] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:34978] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa2cf1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:34983] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:34983] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test6_1_lobpcg # SKIP Command failed so no diff
ok svd_tests-test7_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test8_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tests-test9_1.counts
not ok svd_tests-test9_1+svd_type-lanczos # Error code: 14
not ok svd_tests-test8_1+svd_type-lanczos # Error code: 14
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35037] PMIX ERROR: PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35037] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9559b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35043] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35043] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test9_1 # SKIP Command failed so no diff
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35038] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8e09a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35044] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35044] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [warn] Epoll MOD(4) on fd 19 failed. Old events were 6; read change was 2 (del); write change was 0 (none); close change was 0 (none): Bad file descriptor
ok svd_tests-test8_1 # SKIP Command failed so no diff
not ok svd_tests-test8_1+svd_type-trlanczos # Error code: 14
not ok svd_tests-test9_1+svd_type-trlanczos # Error code: 14
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35072] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb4e98000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35077] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35077] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9ca2c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35078] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35078] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test8_1 # SKIP Command failed so no diff
ok svd_tests-test9_1 # SKIP Command failed so no diff
not ok svd_tests-test9_1+svd_type-cross # Error code: 14
not ok svd_tests-test8_1+svd_type-cross # Error code: 14
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35105] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8c12f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35112] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35112] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35106] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8542b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35111] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35111] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test8_1 # SKIP Command failed so no diff
ok svd_tests-test9_1 # SKIP Command failed so no diff
not ok svd_tests-test9_1+svd_type-cyclic # Error code: 14
not ok svd_tests-test8_1+svd_type-cyclic # Error code: 14
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35140] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f99ebf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35146] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35146] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test9_1 # SKIP Command failed so no diff
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35138] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9f1d9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35145] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35145] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test8_1 # SKIP Command failed so no diff
not ok svd_tests-test9_1+svd_type-lapack # Error code: 14
not ok svd_tests-test8_1+svd_type-lapack # Error code: 14
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35173] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb266e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35180] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35180] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35174] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa251b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35179] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35179] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tests-test8_1 # SKIP Command failed so no diff
ok svd_tests-test9_1 # SKIP Command failed so no diff
not ok svd_tests-test8_1+svd_type-randomized # Error code: 14
not ok svd_tests-test9_1+svd_type-randomized # Error code: 14
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35208] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa684e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35213] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35213] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35207] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9a1a8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35214] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35214] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tests-test8_1 # SKIP Command failed so no diff
ok svd_tests-test9_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_1_scalapack.counts
not ok svd_tutorials-ex14_1_scalapack+nsize-1 # Error code: 14
not ok svd_tutorials-ex14_1+svd_type-trlanczos # Error code: 14
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35267] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8f7b4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35274] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35274] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35268] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f958b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35273] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35273] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex14_1 # SKIP Command failed so no diff
not ok svd_tutorials-ex14_1+svd_type-lanczos # Error code: 14
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35302] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8a4ff000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex14_1 # SKIP Command failed so no diff not ok svd_tutorials-ex14_1+svd_type-randomized # Error code: 14 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35327] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fb8224000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex14_1 # SKIP Command failed so no diff not ok svd_tutorials-ex14_1_scalapack+nsize-2 # Error code: 14 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35298] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb8707000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35308] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35307] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35308] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35307] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-35298@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff not ok svd_tutorials-ex14_1+svd_type-cross # Error code: 14 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35344] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f87f72000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35355] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35355] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex14_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_2.counts not ok svd_tutorials-ex14_1_scalapack+nsize-3 # Error code: 14 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3facbc9000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35364] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35362] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:35364] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:35363] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35362] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-35359@1,2] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex14_1_scalapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_2_cross.counts not ok svd_tutorials-ex14_2 # Error code: 14 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f8543d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_3.counts not ok svd_tutorials-ex14_2_cross # Error code: 14 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35425] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fa5020000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35428] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex14_2_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex14_4.counts not ok svd_tutorials-ex14_3 # Error code: 14 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35465] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faf283000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35482] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15_1.counts not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-trlanczos # Error code: 14 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f837fa000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35488] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35488] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex15_1 # Error code: 14 not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-lanczos # Error code: 14 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa2ccd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35535] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35535] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex14_4 # SKIP Command failed so no diff # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35527] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb5472000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35534] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35534] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex15_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex15_1_scalapack.counts not ok svd_tutorials-ex14_4+svd_ncv-26_svd_type-cross # Error code: 14 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35562] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faae42000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex14_4 # SKIP Command failed so no diff not ok svd_tutorials-ex15_1_scalapack+nsize-1 # Error code: 14 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f8aa96000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35582] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35582] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok svd_tutorials-ex15_1_scalapack # SKIP Command failed so no diff not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-trlanczos # Error code: 14 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:35598] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fa3a27000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:35613] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:35613] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# ok svd_tutorials-ex14_4 # SKIP Command failed so no diff
not ok svd_tutorials-ex15_1_scalapack+nsize-2 # Error code: 14
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f92cc2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35616] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35610] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:35617] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35617] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35616] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-35610@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok svd_tutorials-ex15_1_scalapack # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_1.counts
not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-lanczos # Error code: 14
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35640] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f820ed000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35662] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35662] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex14_4 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_1 # Error code: 14
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35663] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb8b24000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35666] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35666] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_2.counts
not ok svd_tutorials-ex14_4+svd_ncv-12_svd_type-cross # Error code: 14
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35682] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8b037000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35703] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35703] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex14_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_3.counts
not ok svd_tutorials-ex45_2 # Error code: 14
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35710] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb365c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35713] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35713] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_4.counts
not ok svd_tutorials-ex45_3 # Error code: 14
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35743] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa803d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35763] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35763] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5.counts
not ok svd_tutorials-ex45_4 # Error code: 14
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35770] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f95433000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35773] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35773] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cross.counts
not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35805] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb2a91000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35823] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35823] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_5 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_5_cross # Error code: 14
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35830] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8c778000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35833] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35833] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_5_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cross_implicit.counts
not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35848] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f849c1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35852] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35852] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_5 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_5_cross_implicit # Error code: 14
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35877] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb0778000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35880] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35880] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_5_cross_implicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_5_cyclic.counts
not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35894] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f96247000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35897] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35897] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok svd_tutorials-ex45_5 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_5_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35938] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f883bd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35944] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35944] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_5_cyclic # SKIP Command failed so no diff
not ok svd_tutorials-ex45_5+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35933] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f957a5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35941] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35941] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6.counts
not ok svd_tutorials-ex45_5_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35960] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa5364000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35971] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35971] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_5_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6_cross.counts
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:35988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8c35e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:35991] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:35991] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6_cross+svd_cross_explicitmatrix-0 # Error code: 14
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36020] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f918f6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36031] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36031] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6_cross # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36035] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f94acd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36038] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36038] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6_cross+svd_cross_explicitmatrix-1 # Error code: 14
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f81a6e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36067] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36067] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_6_cyclic.counts
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36069] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f93730000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36072] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36072] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36104] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9b2c6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36116] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36116] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6_cyclic # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-single_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36113] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9768c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36119] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36119] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36139] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fad81d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36150] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36150] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7.counts
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36147] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa5e1d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36153] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36153] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-single_svd_trlanczos_oneside-0 # Error code: 14
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36191] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9c678000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36199] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36199] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36194] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbac64000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36200] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36200] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-single_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36228] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa99d0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36231] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36231] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36227] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa9606000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-upper_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36262] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f86353000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36268] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:36268] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb3b6a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-0_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36295] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa1655000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36301] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36301] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36296] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8f4cb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-0_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36329] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa9a88000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36335] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36335] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36330] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa31d6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36336] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36336] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-1_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8f660000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb4fd9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
ok svd_tutorials-ex45_7 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7_cross.counts
not ok svd_tutorials-ex45_6+svd_trlanczos_gbidiag-lower_svd_trlanczos_locking-1_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36398] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbeccf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36414] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36414] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_6 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7_cross+svd_cross_explicitmatrix-0 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_7_cyclic.counts
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36411] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f93841000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36417] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36417] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7_cross # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7_cross+svd_cross_explicitmatrix-1 # Error code: 14
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36452] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa58ac000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36461] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36461] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7_cross # SKIP Command failed so no diff
not ok svd_tutorials-ex45_7_cyclic+svd_cyclic_explicitmatrix-0 # Error code: 14
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36458] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb9fe3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36464] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36464] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_7_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex45_8.counts
not ok svd_tutorials-ex45_7_cyclic+svd_cyclic_explicitmatrix-1 # Error code: 14
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36498] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa9e11000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36508] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36508] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_7_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1.counts
not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-upper_svd_trlanczos_scale-0.1 # Error code: 14
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36505] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f83c5e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36511] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36511] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex45_8 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-upper_svd_trlanczos_scale--20 # Error code: 14
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36547] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9c18f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36555] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36555] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_8 # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1+svd_trlanczos_explicitmatrix-0 # Error code: 14
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36552] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa94a0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36558] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36558] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1 # SKIP Command failed so no diff
not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-lower_svd_trlanczos_scale-0.1 # Error code: 14
not ok svd_tutorials-ex48_1+svd_trlanczos_explicitmatrix-1 # Error code: 14
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36580] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa5f73000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36592] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36592] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36586] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f89760000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36591] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36591] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_1 # SKIP Command failed so no diff
ok svd_tutorials-ex45_8 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_spqr.counts
not ok svd_tutorials-ex45_8+svd_trlanczos_gbidiag-lower_svd_trlanczos_scale--20 # Error code: 14
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36619] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3faf4a2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36636] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36636] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex45_8 # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1_spqr+svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36633] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fa99e4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36639] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36639] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_1_spqr # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_autoscale.counts
not ok svd_tutorials-ex48_1_spqr+svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36673] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb2450000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36683] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36683] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_spqr # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-0 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_cross.counts
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36680] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9f7b9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36686] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36686] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-lower_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36720] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fba7cb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36730] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36730] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1_cross # Error code: 14
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36727] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb537b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36733] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36733] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_1_cyclic.counts
not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-0 # Error code: 14
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36751] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9c722000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36775] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36775] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff
not ok svd_tutorials-ex48_1_cyclic # Error code: 14
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36777] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f82786000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36780] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36780] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4.counts
not ok svd_tutorials-ex48_1_autoscale+svd_trlanczos_gbidiag-upper_svd_trlanczos_oneside-1 # Error code: 14
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36796] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa2014000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36814] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36814] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_1_autoscale # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_spqr.counts
not ok svd_tutorials-ex48_4 # Error code: 14
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36824] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9b82d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36827] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36827] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cross.counts
not ok svd_tutorials-ex48_4_spqr # Error code: 14
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36856] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3facbf1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36873] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36873] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_4_spqr # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cross_implicit.counts
not ok svd_tutorials-ex48_4_cross # Error code: 14
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36884] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb96da000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36887] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36887] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_4_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_4_cyclic.counts
not ok svd_tutorials-ex48_4_cross_implicit # Error code: 14
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbdb78000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36932] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36932] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_4_cross_implicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_5.counts
not ok svd_tutorials-ex48_4_cyclic # Error code: 14
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36944] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9a391000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36947] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36947] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_4_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_5_cross.counts
not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-lower # Error code: 14
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:36976] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa1685000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:36991] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:36991] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_5 # SKIP Command failed so no diff
not ok svd_tutorials-ex48_5_cross # Error code: 14
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37004] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb8e5e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37007] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37007] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_5_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_6_cross.counts
not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-upper # Error code: 14
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37021] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f805a8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37026] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37026] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex48_5 # SKIP Command failed so no diff
not ok svd_tutorials-ex48_5+svd_trlanczos_gbidiag-single # Error code: 14
not ok svd_tutorials-ex48_6_cross # Error code: 14
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37059] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb0099000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37069] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37069] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37065] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa8428000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37071] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37071] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex48_5 # SKIP Command failed so no diff
ok svd_tutorials-ex48_6_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex48_6_cyclic.counts
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex51_1.counts
not ok svd_tutorials-ex51_1+svd_trlanczos_gbidiag-upper # Error code: 14
not ok svd_tutorials-ex48_6_cyclic # Error code: 14
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb5dd9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37131] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37131] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37124] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fb8990000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37130] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37130] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex51_1 # SKIP Command failed so no diff
ok svd_tutorials-ex48_6_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex51_2.counts
not ok svd_tutorials-ex51_1+svd_trlanczos_gbidiag-lower # Error code: 14
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37158] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbce1e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37173] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37173] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex51_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_cross.counts
ok svd_tutorials-ex52_1_cross # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_cyclic.counts
ok svd_tutorials-ex52_1_cyclic # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_trlanczos.counts
ok svd_tutorials-ex52_1_trlanczos # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_1_lapack.counts
ok svd_tutorials-ex52_1_lapack # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_cross.counts
ok svd_tutorials-ex52_2_cross # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_cyclic.counts
ok svd_tutorials-ex52_2_cyclic # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_2_trlanczos.counts
ok svd_tutorials-ex52_2_trlanczos # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_cross.counts
not ok svd_tutorials-ex52_5_cross # Error code: 14
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37301] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fbafc2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37304] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37304] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok svd_tutorials-ex52_5_cross # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_cyclic.counts
not ok svd_tutorials-ex51_2 # Error code: 14
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37175] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb5a9b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37178] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:37179] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37178] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37179] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-37175@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok svd_tutorials-ex51_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_trlanczos.counts
not ok svd_tutorials-ex52_5_cyclic # Error code: 14
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37331] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9e44f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37346] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37346] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex52_5_cyclic # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_5_lapack.counts not ok svd_tutorials-ex52_5_trlanczos # Error code: 14 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37359] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb020b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37362] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37362] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex52_5_trlanczos # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex52_6.counts not ok svd_tutorials-ex52_5_lapack # Error code: 14 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37391] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fb648c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37408] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37408] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex52_5_lapack # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_trlanczos.counts not ok svd_tutorials-ex52_6+svd_type-trlanczos # Error code: 14 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37419] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f87988000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok svd_tutorials-ex52_6 # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-redundant_nsize-1 # Error code: 14 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37451] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f861d4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37466] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37466] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex52_6+svd_type-cross # Error code: 14 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37465] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fb84d7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37469] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37469] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok svd_tutorials-ex52_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_cross.counts not ok svd_tutorials-ex53_1_cross+nsize-1 # Error code: 14 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37514] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fab986000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37517] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37517] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_cross # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-redundant_nsize-2 # Error code: 14 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbdb8f000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37485] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37503] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37505] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37505] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37503] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-37485@1,1] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_cross+nsize-2 # Error code: 14 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f81649000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37539] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37535] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37538] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37539] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37538] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-37535@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok svd_tutorials-ex53_1_cross # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex53_1_cyclic.counts not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-synchronized_nsize-1 # Error code: 14 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37555] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f93066000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37575] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37575] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_cyclic+nsize-1 # Error code: 14 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37583] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f83706000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37586] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37586] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok svd_tutorials-ex53_1_cyclic # SKIP Command failed so no diff not ok svd_tutorials-ex53_1_cyclic+nsize-2 # Error code: 14 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9a141000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:37618] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:37621] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:37622] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37622] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37621] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-37618@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok svd_tutorials-ex53_1_cyclic # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials-ex8_1.counts
not ok svd_tutorials-ex53_1_trlanczos+ds_parallel-synchronized_nsize-2 # Error code: 14
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37600] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f965e8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37611] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:37614] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37614] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37611] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-37600@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok svd_tutorials-ex53_1_trlanczos # SKIP Command failed so no diff
not ok svd_tutorials-ex8_1 # Error code: 14
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37655] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f8f624000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37658] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37658] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials-ex8_1 # SKIP Command failed so no diff
CLINKER installed-arch-linux2-c-opt/tests/svd/tutorials/cnetwork/embedgsvd
RM test-rm-svd.F90
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test1_1.counts
not ok pep_tests-test1_1+type-toar # Error code: 14
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37707] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9ebb3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37710] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37710] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test1_1 # SKIP Command failed so no diff
not ok pep_tests-test1_1+type-qarnoldi # Error code: 14
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37724] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8932c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37727] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37727] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test1_1 # SKIP Command failed so no diff
not ok pep_tests-test1_1+type-linear # Error code: 14
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37741] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f7f90e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37744] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37744] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test1_1_linear_gd.counts
not ok pep_tests-test1_1_linear_gd # Error code: 14
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9158b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37774] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37774] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test1_1_linear_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test10_1.counts
not ok pep_tests-test10_1 # Error code: 14
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37801] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa5078000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test10_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test11_1.counts
not ok pep_tests-test11_1 # Error code: 14
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37831] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f80b30000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37834] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37834] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test11_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test12_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1.counts
not ok pep_tests-test12_1+pep_type-toar # Error code: 14
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37861] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f91694000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37872] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37872] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test12_1 # SKIP Command failed so no diff
not ok pep_tests-test2_1+pep_type-toar # Error code: 14
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37871] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fac93d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37875] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37875] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_1 # SKIP Command failed so no diff
not ok pep_tests-test12_1+pep_type-linear # Error code: 14
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37891] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9e6a0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37906] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37906] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test12_1 # SKIP Command failed so no diff
not ok pep_tests-test2_1+pep_type-linear # Error code: 14
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37904] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbbe45000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37909] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37909] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_toar_mgs.counts
not ok pep_tests-test12_1+pep_type-qarnoldi # Error code: 14
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37925] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9a184000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37948] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:37948] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tests-test12_1 # SKIP Command failed so no diff
not ok pep_tests-test2_1_toar_mgs # Error code: 14
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:37953] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb9dc0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:37956] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:37956] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tests-test2_1_toar_mgs # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_qarnoldi.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_1_linear_gd.counts not ok pep_tests-test2_1_qarnoldi # Error code: 14 not ok pep_tests-test2_1_linear_gd # Error code: 14 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38007] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbb241000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38014] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38014] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fb03bb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38016] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # [sbuild:38016] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
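Editor's note: every failing test above prints the same PMIx hint, namely to disable the gds/shmem2 component by exporting PMIX_MCA_gds=hash. A minimal sketch of applying that workaround before re-running the suite follows; the variable name and value are quoted verbatim from the banner, while the idea of re-exporting it in the build shell (rather than any specific test command, which this log does not show) is an assumption:

```shell
# Disable the PMIx gds/shmem2 component, as the error banner suggests,
# so PMIx falls back to its hash-based key-value store instead of
# attaching a shared-memory segment at a fixed base address.
export PMIX_MCA_gds=hash

# The failing tests would then be re-run in this same shell; the echo
# below merely confirms the override is in place.
echo "PMIX_MCA_gds=$PMIX_MCA_gds"
```

Whether this restores the green test results on riscv64 is untested here; it only silences the gds/shmem2 attach failure path the banner describes.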
ok pep_tests-test2_1_qarnoldi # SKIP Command failed so no diff
ok pep_tests-test2_1_linear_gd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_toar_scaleboth.counts
not ok pep_tests-test2_2+pep_type-toar # Error code: 14
not ok pep_tests-test2_2_toar_scaleboth # Error code: 14
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbbbec000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38076] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:38076] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38069] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8ce3f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38075] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38075] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_2 # SKIP Command failed so no diff
ok pep_tests-test2_2_toar_scaleboth # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_toar_transform.counts
not ok pep_tests-test2_2+pep_type-linear # Error code: 14
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38103] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f87bf1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38118] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38118] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_qarnoldi.counts
not ok pep_tests-test2_2_toar_transform # Error code: 14
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38120] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb987d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38123] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38123] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_2_toar_transform # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_linear_explicit.counts
not ok pep_tests-test2_2_qarnoldi # Error code: 14
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38167] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f8abee000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38178] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38178] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2_qarnoldi # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_linear_explicit_her.counts
not ok pep_tests-test2_2_linear_explicit # Error code: 14
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38180] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa8021000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38183] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38183] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2_linear_explicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_stoar.counts
not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-0,1 # Error code: 14
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38226] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb5979000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38240] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38240] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff
not ok pep_tests-test2_2_stoar # Error code: 14
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38239] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3faed05000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38243] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38243] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_2_stoar # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_2_jd.counts
not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-1,0 # Error code: 14
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38259] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9455e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38275] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38275] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff
not ok pep_tests-test2_2_jd # Error code: 14
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38287] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f929cf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38290] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38290] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_2_jd # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_3.counts
not ok pep_tests-test2_2_linear_explicit_her+pep_linear_linearization-.3,.7 # Error code: 14
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38304] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3facdd7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_2_linear_explicit_her # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_schur.counts
not ok pep_tests-test2_3+pep_extract-none # Error code: 14
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38334] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f920f1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38337] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38337] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_3 # SKIP Command failed so no diff
not ok pep_tests-test2_4_schur # Error code: 14
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f94907000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_4_schur # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_mbe.counts
not ok pep_tests-test2_3+pep_extract-norm # Error code: 14
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38381] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f858cd000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38384] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38384] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_3 # SKIP Command failed so no diff not ok pep_tests-test2_4_mbe # Error code: 14 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38413] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f82565000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38428] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38428] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_4_mbe # SKIP Command failed so no diff not ok pep_tests-test2_3+pep_extract-residual # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_explicit.counts # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38426] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f82211000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38431] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38431] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test2_3 # SKIP Command failed so no diff not ok pep_tests-test2_4_explicit # Error code: 14 not ok pep_tests-test2_3+pep_extract-structured # Error code: 14 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38472] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb286d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38478] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38478] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38466] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f8ba52000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_4_explicit # SKIP Command failed so no diff ok pep_tests-test2_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_schur.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_mbe.counts not ok pep_tests-test2_4_multiple_mbe # Error code: 14 not ok pep_tests-test2_4_multiple_schur # Error code: 14 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38531] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f8361d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38538] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38538] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test2_4_multiple_mbe # SKIP Command failed so no diff # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38532] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fb69d6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38537] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38537] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tests-test2_4_multiple_schur # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_4_multiple_explicit.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_5.counts not ok pep_tests-test2_4_multiple_explicit # Error code: 14 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38591] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa6633000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38597] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38597] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tests-test2_4_multiple_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_6.counts not ok pep_tests-test2_6+pep_extract-none # Error code: 14 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38630] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f99f07000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38633] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38633] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test2_6 # SKIP Command failed so no diff not ok pep_tests-test2_6+pep_extract-norm # Error code: 14 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38647] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fa8282000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38650] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38650] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test2_6 # SKIP Command failed so no diff not ok pep_tests-test2_5 # Error code: 14 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:38592] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c 
at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f92190000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:38599] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:38598] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:38598] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:38599] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-38592@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok pep_tests-test2_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_7.counts
not ok pep_tests-test2_6+pep_extract-residual # Error code: 14
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb7f27000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_6 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_schur.counts
not ok pep_tests-test2_7+pep_extract-none # Error code: 14
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38692] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f9ac6e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_7 # SKIP Command failed so no diff
not ok pep_tests-test2_8_schur # Error code: 14
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38722] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9824f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38727] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38727] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_8_schur # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_mbe.counts
not ok pep_tests-test2_7+pep_extract-norm # Error code: 14
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38739] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f890d6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38742] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38742] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_7 # SKIP Command failed so no diff
not ok pep_tests-test2_8_mbe # Error code: 14
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38771] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa6cec000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38778] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38778] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_8_mbe # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_explicit.counts
not ok pep_tests-test2_7+pep_extract-residual # Error code: 14
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38786] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb80d1000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38789] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38789] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_7 # SKIP Command failed so no diff
not ok pep_tests-test2_7+pep_extract-structured # Error code: 14
not ok pep_tests-test2_8_explicit # Error code: 14
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f87782000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38835] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38835] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38830] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9e2f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38836] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38836] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_8_explicit # SKIP Command failed so no diff
ok pep_tests-test2_7 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_schur.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_mbe.counts
not ok pep_tests-test2_8_multiple_mbe # Error code: 14
not ok pep_tests-test2_8_multiple_schur # Error code: 14
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38889] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f887e8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38895] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38895] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38890] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa71b5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38896] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38896] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test2_8_multiple_mbe # SKIP Command failed so no diff
ok pep_tests-test2_8_multiple_schur # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_8_multiple_explicit.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_mbe.counts
not ok pep_tests-test2_8_multiple_explicit # Error code: 14
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38949] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9b155000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38955] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38955] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test2_8_multiple_explicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_explicit.counts
not ok pep_tests-test2_9_explicit # Error code: 14
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38988] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb7e9d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38992] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:38991] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38992] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:38991] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-38988@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok pep_tests-test2_9_explicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_multiple_mbe.counts
not ok pep_tests-test2_9_mbe # Error code: 14
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa8e8a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:38950] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:38956] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:38957] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38956] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:38957] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-38950@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok pep_tests-test2_9_mbe # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_9_multiple_explicit.counts
not ok pep_tests-test2_9_multiple_mbe # Error code: 14
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39021] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f85409000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:39024] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39024] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:39025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-39021@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
#
not ok pep_tests-test2_9_multiple_explicit # Error code: 14
ok pep_tests-test2_9_multiple_mbe # SKIP Command failed so no diff
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa752c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39054] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:39057] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:39058] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39058] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute.
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39057] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-39054@1,0] # Exit code: 14 # -------------------------------------------------------------------------- ok pep_tests-test2_9_multiple_explicit # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_10.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_12.counts not ok pep_tests-test2_12 # Error code: 14 not ok pep_tests-test2_10 # Error code: 14 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39112] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c 
at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f89f99000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39119] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39119] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-39112@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # [sbuild:39112] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 102 ok pep_tests-test2_12 # SKIP Command failed so no diff # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39110] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f8176a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39118] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39118] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:39110] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 # [sbuild:39110] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120 ok pep_tests-test2_10 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test2_13.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test3_1.counts not ok pep_tests-test2_13 # Error code: 14 not ok pep_tests-test3_1 # Error code: 14 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39183] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to 
attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f9775b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39190] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39190] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39184] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f9f4a7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39189] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39189] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tests-test2_13 # SKIP Command failed so no diff ok pep_tests-test3_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test4_1_real.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_1.counts not ok pep_tests-test5_1 # Error code: 14 not ok pep_tests-test4_1_real # Error code: 14 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39244] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f880f4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39249] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39249] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test4_1_real # SKIP Command failed so no diff # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39243] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9ba51000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39250] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39250] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test5_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_2.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_3.counts not ok pep_tests-test5_2 # Error code: 14 not ok pep_tests-test5_3 # Error code: 14 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39303] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fa9ab7000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39309] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39309] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39304] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f97214000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39310] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39310] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test5_2 # SKIP Command failed so no diff ok pep_tests-test5_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_4.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test5_5.counts not ok pep_tests-test5_5 # Error code: 14 not ok pep_tests-test5_4 # Error code: 14 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39363] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3f9a89d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39370] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39370] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39364] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f96a73000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39369] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39369] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tests-test5_5 # SKIP Command failed so no diff ok pep_tests-test5_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test6_1.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test6_2.counts not ok pep_tests-test6_2 # Error code: 14 not ok pep_tests-test6_1+pep_type-toar # Error code: 14 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39423] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fae46e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39430] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39430] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tests-test6_2 # SKIP Command failed so no diff # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39424] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9480d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39429] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39429] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test6_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test7_1.counts
not ok pep_tests-test6_1+pep_type-qarnoldi # Error code: 14
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39458] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fad215000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39472] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39472] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test6_1 # SKIP Command failed so no diff
not ok pep_tests-test7_1 # Error code: 14
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39474] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f855ef000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test7_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test8_1.counts
not ok pep_tests-test6_1+pep_type-linear # Error code: 14
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39493] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8db58000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39519] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39519] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test6_1 # SKIP Command failed so no diff
not ok pep_tests-test8_1 # Error code: 14
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39521] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f83ae3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39524] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39524] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tests-test8_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tests-test9_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1.counts
not ok pep_tests-test9_1 # Error code: 14
not ok pep_tutorials-ex16_1+pep_type-toar # Error code: 14
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb23c2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39578] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9ab12000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39584] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39584] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tests-test9_1 # SKIP Command failed so no diff
ok pep_tutorials-ex16_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_linear.counts
not ok pep_tutorials-ex16_1+pep_type-qarnoldi # Error code: 14
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f95e48000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39626] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39626] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex16_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_linear_symm.counts
not ok pep_tutorials-ex16_1_linear # Error code: 14
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39628] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb05be000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39631] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39631] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex16_1_linear # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_stoar.counts
not ok pep_tutorials-ex16_1_linear_symm # Error code: 14
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39670] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa488e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39686] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39686] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex16_1_linear_symm # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex16_1_stoar_t.counts
not ok pep_tutorials-ex16_1_stoar # Error code: 14
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39688] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa2064000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39691] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39691] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex16_1_stoar # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex17_1.counts
not ok pep_tutorials-ex16_1_stoar_t # Error code: 14
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39732] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f97dcb000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39746] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39746] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex16_1_stoar_t # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex28_1.counts
not ok pep_tutorials-ex17_1+pep_type-toar # Error code: 14
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39748] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f90112000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39751] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39751] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex17_1 # SKIP Command failed so no diff
not ok pep_tutorials-ex17_1+pep_type-qarnoldi # Error code: 14
not ok pep_tutorials-ex28_1+pep_type-toar # Error code: 14
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39790] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9b225000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39797] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39797] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex17_1 # SKIP Command failed so no diff
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39792] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fabc4e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39798] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39798] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex28_1 # SKIP Command failed so no diff
not ok pep_tutorials-ex28_1+pep_type-qarnoldi # Error code: 14
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39826] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8349e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39829] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39829] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials-ex28_1 # SKIP Command failed so no diff not ok pep_tutorials-ex17_1+pep_type-linear # Error code: 14 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39825] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f8e8c6000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39832] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39832] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials-ex17_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex38_1.counts not ok pep_tutorials-ex28_1+pep_type-linear # Error code: 14 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:39853] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3f7f782000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:39876] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:39876] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex28_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex38_2.counts
not ok pep_tutorials-ex38_1 # Error code: 14
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39874] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8188d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39879] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39879] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex38_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex40_1.counts
not ok pep_tutorials-ex38_2 # Error code: 14
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39926] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f97ed8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39936] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39936] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex38_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex40_1_transform.counts
not ok pep_tutorials-ex40_1 # Error code: 14
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39933] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f945f2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39939] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39939] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex40_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex50_1.counts
not ok pep_tutorials-ex40_1_transform # Error code: 14
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39986] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fb2a64000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39996] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39996] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex40_1_transform # SKIP Command failed so no diff
not ok pep_tutorials-ex50_1+pep_type-toar # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials-ex50_1_linear.counts
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:39993] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa706d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:39999] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:39999] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex50_1 # SKIP Command failed so no diff
not ok pep_tutorials-ex50_1+pep_type-qarnoldi # Error code: 14
not ok pep_tutorials-ex50_1_linear # Error code: 14
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40033] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa4800000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40043] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40043] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials-ex50_1 # SKIP Command failed so no diff
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40040] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f8d63d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40046] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40046] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials-ex50_1_linear # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_1_stoar.counts
not ok pep_tutorials_nlevp-acoustic_wave_1d_1_stoar # Error code: 14
not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-toar # Error code: 14
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40098] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f8ff1a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40105] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40105] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40100] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb6f60000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40106] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40106] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff
ok pep_tutorials_nlevp-acoustic_wave_1d_1_stoar # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_2.counts
not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-qarnoldi # Error code: 14
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40133] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f89364000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40148] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40148] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-none # Error code: 14
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40150] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f95b13000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40153] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40153] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-norm # Error code: 14
not ok pep_tutorials_nlevp-acoustic_wave_1d_1+pep_type-linear # Error code: 14
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40181] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f92c6c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40187] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40187] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40169] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa7ca6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40186] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40186] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff
ok pep_tutorials_nlevp-acoustic_wave_1d_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_3.counts
not ok pep_tutorials_nlevp-acoustic_wave_1d_2+pep_extract-residual # Error code: 14
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40214] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8d0b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40231] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40231] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_1d_4.counts
not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-none # Error code: 14
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40228] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa9248000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40234] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40234] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-norm # Error code: 14
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40270] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9e0db000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40278] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40278] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-acoustic_wave_1d_4 # Error code: 14
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40275] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9705c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40281] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40281] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_1d_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_1.counts
not ok pep_tutorials_nlevp-acoustic_wave_1d_3+pep_extract-residual # Error code: 14
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40302] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa2b0d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40325] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40325] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_1d_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_1_toar.counts
not ok pep_tutorials_nlevp-acoustic_wave_2d_1+pep_type-qarnoldi # Error code: 14
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40324] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9055f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40328] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40328] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_2d_1 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-acoustic_wave_2d_1+pep_type-linear # Error code: 14
not ok pep_tutorials_nlevp-acoustic_wave_2d_1_toar # Error code: 14
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40369] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9f0fc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40375] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40375] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40366] PMIX ERROR: PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40366] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f88f26000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40374] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40374] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_2d_1_toar # SKIP Command failed so no diff
ok pep_tutorials_nlevp-acoustic_wave_2d_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b.counts
not ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b # Error code: 14
not ok pep_tutorials_nlevp-acoustic_wave_2d_2 # Error code: 14
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40429] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fa327b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40435] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40435] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_b # SKIP Command failed so no diff
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40428] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f94b6c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40434] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40434] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-acoustic_wave_2d_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_1_toar.counts
not ok pep_tutorials_nlevp-butterfly_1_toar # Error code: 14
not ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab # Error code: 14
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40489] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fa7442000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40494] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40494] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40488] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa5a60000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40495] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40495] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-acoustic_wave_2d_2_lin_ab # SKIP Command failed so no diff
ok pep_tutorials_nlevp-butterfly_1_toar # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_1_linear.counts
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-butterfly_2.counts
not ok pep_tutorials_nlevp-butterfly_1_linear # Error code: 14
not ok pep_tutorials_nlevp-butterfly_2+pep_type-toar # Error code: 14
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40547] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f98525000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40555] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40555] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40549] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb8b24000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:40554] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:40554] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok pep_tutorials_nlevp-butterfly_2 # SKIP Command failed so no diff
ok pep_tutorials_nlevp-butterfly_1_linear # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1.counts
not ok pep_tutorials_nlevp-butterfly_2+pep_type-linear # Error code: 14
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:40582] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8e3d0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40597] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40597] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-butterfly_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1_qarnoldi.counts not ok pep_tutorials_nlevp-damped_beam_1+pep_type-toar # Error code: 14 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40599] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3fb10ff000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40602] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40602] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-damped_beam_1+pep_type-linear # Error code: 14 not ok pep_tutorials_nlevp-damped_beam_1_qarnoldi # Error code: 14 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40639] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fb7c67000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40646] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40646] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40643] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3faec7b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40649] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40649] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1_qarnoldi # SKIP Command failed so no diff ok pep_tutorials_nlevp-damped_beam_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-damped_beam_1_jd.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-loaded_string_1.counts not ok pep_tutorials_nlevp-loaded_string_1 # Error code: 14 not ok pep_tutorials_nlevp-damped_beam_1_jd # Error code: 14 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40703] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8fcb5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40708] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40708] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-damped_beam_1_jd # SKIP Command failed so no diff # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40702] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fbd3b4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40709] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40709] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-loaded_string_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-planar_waveguide_1.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_1.counts not ok pep_tutorials_nlevp-sleeper_1+pep_type-toar # Error code: 14 not ok pep_tutorials_nlevp-planar_waveguide_1+pep_type-toar # Error code: 14 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40762] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9929d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40769] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40769] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40763] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8c8df000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40768] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40768] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-planar_waveguide_1 # SKIP Command failed so no diff ok pep_tutorials_nlevp-sleeper_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-sleeper_1+pep_type-linear # Error code: 14 not ok pep_tutorials_nlevp-planar_waveguide_1+pep_type-linear # Error code: 14 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40797] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f82ff5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40803] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40803] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tutorials_nlevp-sleeper_1 # SKIP Command failed so no diff # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40796] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fbbac8000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40802] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40802] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-planar_waveguide_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_1_qarnoldi.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_2_toar.counts not ok pep_tutorials_nlevp-sleeper_1_qarnoldi # Error code: 14 not ok pep_tutorials_nlevp-sleeper_2_toar # Error code: 14 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40856] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faecb5000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40863] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40863] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40857] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa3e87000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40862] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40862] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
ok pep_tutorials_nlevp-sleeper_1_qarnoldi # SKIP Command failed so no diff ok pep_tutorials_nlevp-sleeper_2_toar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_2_jd.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_3.counts not ok pep_tutorials_nlevp-sleeper_3 # Error code: 14 not ok pep_tutorials_nlevp-sleeper_2_jd # Error code: 14 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fbd5eb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40917] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3fae836000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40922] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40922] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-sleeper_2_jd # SKIP Command failed so no diff ok pep_tutorials_nlevp-sleeper_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-sleeper_4.counts TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1.counts not ok pep_tutorials_nlevp-sleeper_4 # Error code: 14 not ok pep_tutorials_nlevp-spring_1+pep_type-toar # Error code: 14 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40977] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3f86b6c000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40982] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40982] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:40976] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f7fead000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:40983] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:40983] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-spring_1 # SKIP Command failed so no diff ok pep_tutorials_nlevp-sleeper_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1_stoar.counts not ok pep_tutorials_nlevp-spring_1+pep_type-linear # Error code: 14 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41010] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at 
a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9e99a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41025] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41025] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_1_qarnoldi.counts not ok pep_tutorials_nlevp-spring_1_stoar # Error code: 14 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41027] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa246e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41030] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41030] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-spring_1_stoar # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_2.counts not ok pep_tutorials_nlevp-spring_1_qarnoldi # Error code: 14 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41070] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f9e871000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41085] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41085] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_1_qarnoldi # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_3.counts not ok pep_tutorials_nlevp-spring_2 # Error code: 14 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41087] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3f8ad3d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41090] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41090] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-spring_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_4.counts not ok pep_tutorials_nlevp-spring_3 # Error code: 14 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41131] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9114a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41146] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41146] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_5.counts not ok pep_tutorials_nlevp-spring_4 # Error code: 14 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41147] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fba30e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41150] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41150] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-spring_6.counts not ok pep_tutorials_nlevp-spring_5 # Error code: 14 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41192] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f9a822000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41207] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41207] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-spring_5 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1.counts not ok pep_tutorials_nlevp-spring_6 # Error code: 14 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41206] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3f93f72000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41210] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. 
Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41210] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-spring_6 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_h1.counts not ok pep_tutorials_nlevp-wiresaw_1+pep_type-toar # Error code: 14 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3fb3c2e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41267] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41267] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1 # SKIP Command failed so no diff not ok pep_tutorials_nlevp-wiresaw_1_linear_h1 # Error code: 14 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41265] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fad590000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41270] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41270] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok pep_tutorials_nlevp-wiresaw_1_linear_h1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_h2.counts not ok pep_tutorials_nlevp-wiresaw_1+pep_type-qarnoldi # Error code: 14 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41289] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a 
different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9d41a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41307] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41307] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok pep_tutorials_nlevp-wiresaw_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_1_linear_other.counts not ok pep_tutorials_nlevp-wiresaw_1_linear_h2 # Error code: 14 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41314] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f908c4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41317] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41317] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-wiresaw_1_linear_h2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2.counts
not ok pep_tutorials_nlevp-wiresaw_1_linear_other # Error code: 14
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41347] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f8e17d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41367] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41367] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-wiresaw_1_linear_other # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2_linear.counts
not ok pep_tutorials_nlevp-wiresaw_2+pep_type-toar # Error code: 14
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41374] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f83dbe000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41377] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41377] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-wiresaw_2 # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-wiresaw_2_linear+pep_linear_linearization-1,0 # Error code: 14
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41407] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9dd3f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41421] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41421] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tutorials_nlevp-wiresaw_2_linear # SKIP Command failed so no diff
not ok pep_tutorials_nlevp-wiresaw_2+pep_type-qarnoldi # Error code: 14
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41418] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f895cd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41424] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41424] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-wiresaw_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/pep_tutorials_nlevp-wiresaw_2_linear_other.counts
not ok pep_tutorials_nlevp-wiresaw_2_linear+pep_linear_linearization-0,1 # Error code: 14
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41443] PMIX ERROR: PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41443] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9e07b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41461] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41461] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok pep_tutorials_nlevp-wiresaw_2_linear # SKIP Command failed so no diff
RM test-rm-pep.F90
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test1_1_real.counts
not ok pep_tutorials_nlevp-wiresaw_2_linear_other # Error code: 14
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41468] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fa1fa5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41471] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41471] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok pep_tutorials_nlevp-wiresaw_2_linear_other # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test1_3_real.counts
not ok nep_tests-test1_1_real+nep_type-rii # Error code: 14
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41506] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9cf43000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41525] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41525] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test1_1_real # SKIP Command failed so no diff
not ok nep_tests-test1_3_real # Error code: 14
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9c46f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41532] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41532] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test1_3_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1.counts
not ok nep_tests-test1_1_real+nep_type-slp # Error code: 14
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41546] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb5150000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41552] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41552] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test1_1_real # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_rii.counts
not ok nep_tests-test10_1_rii+split-0 # Error code: 14
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fbf1a2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41611] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:41614] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:41615] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41614] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:41615] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-41611@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tests-test10_1_rii # SKIP Command failed so no diff
not ok nep_tests-test10_1 # Error code: 14
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f96899000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41579] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41576] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:41580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41579] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41580] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-41576@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tests-test10_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_narnoldi.counts
not ok nep_tests-test10_1_narnoldi # Error code: 14
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa5235000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41664] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:41667] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:41668] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41668] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:41667] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-41664@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
# ok nep_tests-test10_1_narnoldi # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_slp.counts
not ok nep_tests-test10_1_slp+split-0 # Error code: 14
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f808ae000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41697] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:41700] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:41701] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41701] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41700] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41697@1,0] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_slp # SKIP Command failed so no diff not ok nep_tests-test10_1_rii+split-1 # Error code: 14 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file 
../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f927a2000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41635] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41631] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41634] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41635] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41634] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. The first process to do so was: # # Process name: [prterun-sbuild-41631@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_rii # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_interpol.counts not ok nep_tests-test10_1_slp+split-1 # Error code: 14 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c 
at line 2476 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaab0000000 # Acquired Address: 0x3f80786000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41717] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41721] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41720] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41720] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other 
processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41721] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # -------------------------------------------------------------------------- # prterun detected that one or more processes exited with non-zero status, # thus causing the job to be terminated. 
The first process to do so was: # # Process name: [prterun-sbuild-41717@1,1] # Exit code: 14 # -------------------------------------------------------------------------- # ok nep_tests-test10_1_slp # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_1_narnoldi_sync.counts not ok nep_tests-test10_1_interpol # Error code: 14 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:41750] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component 
attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f97934000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:41754] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # [sbuild:41753] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41754] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:41753] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-41750@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
# ok nep_tests-test10_1_interpol # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_2_interpol.counts
not ok nep_tests-test10_1_narnoldi_sync # Error code: 14
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f81714000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41787] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41783] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:41786] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41787] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41786] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-41783@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tests-test10_1_narnoldi_sync # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test10_2_nleigs_real.counts
not ok nep_tests-test10_2_interpol # Error code: 14
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41833] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb98c5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41844] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41844] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test10_2_interpol # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test12_1.counts
not ok nep_tests-test10_2_nleigs_real+split-0 # Error code: 14
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41842] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f86f57000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41847] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41847] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test10_2_nleigs_real # SKIP Command failed so no diff
not ok nep_tests-test10_2_nleigs_real+split-1 # Error code: 14
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41883] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fbc94b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41891] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41891] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test10_2_nleigs_real # SKIP Command failed so no diff
not ok nep_tests-test12_1 # Error code: 14
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41888] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa45ac000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41894] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41894] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test12_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test13_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test14_1.counts
not ok nep_tests-test13_1 # Error code: 14
not ok nep_tests-test14_1 # Error code: 14
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41944] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f95dc7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41951] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41951] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# ok nep_tests-test13_1 # SKIP Command failed so no diff
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:41948] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f862f9000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:41954] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:41954] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test14_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test15_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test16_1.counts
not ok nep_tests-test15_1 # Error code: 14
not ok nep_tests-test16_1 # Error code: 14
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42007] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f935ea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42014] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42014] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42008] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb2d87000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42013] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42013] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test16_1 # SKIP Command failed so no diff
ok nep_tests-test15_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_2_interpol.counts
not ok nep_tests-test17_2_interpol # Error code: 14
not ok nep_tests-test17_1+nep_two_sided-0_split-0 # Error code: 14
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42067] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa15a3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42073] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42073] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42068] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3faca0f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42074] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42074] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test17_2_interpol # SKIP Command failed so no diff
ok nep_tests-test17_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test17_2_nleigs_real.counts
not ok nep_tests-test17_1+nep_two_sided-0_split-1 # Error code: 14
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42101] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f8ddf6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42116] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42116] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test17_1 # SKIP Command failed so no diff
not ok nep_tests-test17_2_nleigs_real+split-0 # Error code: 14
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42118] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f96b68000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42121] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42121] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test17_2_nleigs_real # SKIP Command failed so no diff
not ok nep_tests-test17_1+nep_two_sided-1_split-0 # Error code: 14
not ok nep_tests-test17_2_nleigs_real+split-1 # Error code: 14
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42137] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa54b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42155] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42155] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42149] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fb9c4b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42154] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42154] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test17_2_nleigs_real # SKIP Command failed so no diff
ok nep_tests-test17_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test2_1.counts
not ok nep_tests-test17_1+nep_two_sided-1_split-1 # Error code: 14
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42182] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f87a4b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42199] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42199] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test17_1 # SKIP Command failed so no diff
not ok nep_tests-test2_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test3_1.counts
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42196] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa39c7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42202] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42202] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test3_1_ts.counts
not ok nep_tests-test3_1 # Error code: 14
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42249] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8e935000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42259] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42259] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test4_1.counts
not ok nep_tests-test3_1_ts # Error code: 14
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42256] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fadbea000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42262] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42262] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test3_1_ts # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test5_1.counts
not ok nep_tests-test4_1 # Error code: 14
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42309] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9590d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42319] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42319] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test4_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test5_3.counts
not ok nep_tests-test5_1 # Error code: 14
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42316] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fa6a7f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42322] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42322] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test5_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test6_1.counts
not ok nep_tests-test5_3 # Error code: 14
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42369] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f8c51d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42379] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42379] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test5_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test7_1.counts
not ok nep_tests-test6_1 # Error code: 14
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42376] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f97b88000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42382] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42382] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test6_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test7_2.counts
not ok nep_tests-test7_1+nsize-1 # Error code: 14
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42429] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fbc345000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42439] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42439] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test7_1 # SKIP Command failed so no diff
not ok nep_tests-test7_2+nsize-1 # Error code: 14
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42436] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3faee2f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42442] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42442] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test7_2 # SKIP Command failed so no diff
not ok nep_tests-test7_1+nsize-2 # Error code: 14
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42461] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f98a7d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42474] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:42473] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42473] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42474] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-42461@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tests-test7_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_1.counts
not ok nep_tests-test8_1 # Error code: 14
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42511] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f81189000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42514] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42514] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test8_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_2.counts
not ok nep_tests-test7_2+nsize-2 # Error code: 14
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f9c074000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42470] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:42478] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:42477] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42478] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42477] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-42470@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tests-test7_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_3.counts
not ok nep_tests-test8_2 # Error code: 14
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42551] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f8e3ce000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42566] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42566] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test8_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test8_4.counts
not ok nep_tests-test8_3 # Error code: 14
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42569] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f94cc3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42572] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42572] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tests-test8_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tests-test9_1.counts
not ok nep_tests-test8_4 # Error code: 14
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42606] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f9e305000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42624] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42624] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test8_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_1.counts
not ok nep_tests-test9_1 # Error code: 14
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42629] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f9c90b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42632] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42632] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tests-test9_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_2.counts
not ok nep_tutorials-ex20_1 # Error code: 14
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42667] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fad3a3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42685] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42685] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex20_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_3.counts
not ok nep_tutorials-ex20_2 # Error code: 14
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42689] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f95edc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42692] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42692] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex20_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex20_4.counts
not ok nep_tutorials-ex20_3+nep_two_sided-0 # Error code: 14
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42727] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbe137000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42745] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42745] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# -------------------------------------------------------------------------- # ok nep_tutorials-ex20_3 # SKIP Command failed so no diff not ok nep_tutorials-ex20_4 # Error code: 14 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42749] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f88c4e000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:42752] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:42752] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! 
# -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. # -------------------------------------------------------------------------- # ok nep_tutorials-ex20_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex21_1_rii.counts not ok nep_tutorials-ex20_3+nep_two_sided-1 # Error code: 14 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:42768] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3face7c000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42776] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42776] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex20_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex21_1_slp.counts
not ok nep_tutorials-ex21_1_rii+nsize-1 # Error code: 14
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42796] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fadadf000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42799] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42799] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex21_1_rii # SKIP Command failed so no diff
not ok nep_tutorials-ex21_1_slp+nsize-1 # Error code: 14
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42828] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9c8b0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42837] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42837] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex21_1_slp # SKIP Command failed so no diff
not ok nep_tutorials-ex21_1_rii+nsize-2 # Error code: 14
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42843] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fbc877000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:42847] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42847] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-42843@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tutorials-ex21_1_rii # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_1.counts
not ok nep_tutorials-ex22_1+nep_type-rii # Error code: 14
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42894] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f7fb41000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42897] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42897] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex22_1 # SKIP Command failed so no diff
not ok nep_tutorials-ex21_1_slp+nsize-2 # Error code: 14
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8b60b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42865] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:42886] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:42885] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42886] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42885] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-42865@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tutorials-ex21_1_slp # SKIP Command failed so no diff
not ok nep_tutorials-ex22_1+nep_type-slp # Error code: 14
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42915] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8c17f000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42918] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42918] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex22_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_2.counts
not ok nep_tutorials-ex22_1+nep_type-narnoldi # Error code: 14
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42949] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f7fa44000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42960] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42960] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex22_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3.counts
not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-none # Error code: 14
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42958] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8468d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:42963] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:42963] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex22_2 # SKIP Command failed so no diff
not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-norm # Error code: 14
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:42999] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fb31ce000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43007] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43007] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex22_2 # SKIP Command failed so no diff
not ok nep_tutorials-ex22_3+nep_type-rii # Error code: 14
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43004] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fa7ba4000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43010] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43010] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials-ex22_3 # SKIP Command failed so no diff
not ok nep_tutorials-ex22_2+nep_interpol_pep_extract-residual # Error code: 14
not ok nep_tutorials-ex22_3+nep_type-slp # Error code: 14
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43034] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f99f7a000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43044] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43044] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex22_2 # SKIP Command failed so no diff
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43038] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f942f6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43042] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43042] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials-ex22_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_simpleu.counts
not ok nep_tutorials-ex22_3+nep_type-narnoldi # Error code: 14
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43072] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
# # Requested Address: 0x2aaab4000000 # Acquired Address: 0x3fb2189000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43088] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43088] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex22_3 # SKIP Command failed so no diff not ok nep_tutorials-ex22_3_simpleu+nep_type-rii # Error code: 14 TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_slp_thres.counts # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43085] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. 
Your job will now likely # abort. # # Requested Address: 0x2aaacc000000 # Acquired Address: 0x3fa8020000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43091] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43091] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff not ok nep_tutorials-ex22_3_simpleu+nep_type-slp # Error code: 14 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43125] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f8e89b000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43135] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43135] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff not ok nep_tutorials-ex22_3_slp_thres # Error code: 14 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43132] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3faccdb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43138] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43138] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_slp_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_3_rii_thres.counts not ok nep_tutorials-ex22_3_simpleu+nep_type-narnoldi # Error code: 14 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43155] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3faaa80000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43179] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43179] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_simpleu # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex22_4.counts not ok nep_tutorials-ex22_3_rii_thres # Error code: 14 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43182] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaab8000000 # Acquired Address: 0x3fa5ac4000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43185] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43185] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_3_rii_thres # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_1.counts not ok nep_tutorials-ex22_4 # Error code: 14 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43226] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac4000000 # Acquired Address: 0x3f9069a000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43242] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43242] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex22_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_3.counts not ok nep_tutorials-ex27_1 # Error code: 14 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43241] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. # # Requested Address: 0x2aaaac000000 # Acquired Address: 0x3fbaf69000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43245] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43245] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex27_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_2.counts not ok nep_tutorials-ex27_3 # Error code: 14 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43291] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaabc000000 # Acquired Address: 0x3f85a06000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43302] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43302] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_3 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_4.counts not ok nep_tutorials-ex27_2 # Error code: 14 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43301] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3f9593d000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43305] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43305] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! ok nep_tutorials-ex27_2 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex27_9.counts not ok nep_tutorials-ex27_4 # Error code: 14 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43351] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fbbbc0000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43362] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43362] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_4 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials-ex42_1.counts not ok nep_tutorials-ex27_9 # Error code: 14 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43360] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac0000000 # Acquired Address: 0x3fa5738000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43365] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43365] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex27_9 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_1.counts not ok nep_tutorials-ex42_1 # Error code: 14 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43409] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
# # Requested Address: 0x2aaac8000000 # Acquired Address: 0x3fa7aeb000 # # If this problem persists, please consider disabling the gds/shmem2 component by # setting in your environment the following: PMIX_MCA_gds=hash # -------------------------------------------------------------------------- # [sbuild:43422] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278 # *** An error occurred in MPI_Init_thread # *** on a NULL communicator # *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort, # *** and MPI will try to terminate your MPI job as well) # [sbuild:43422] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed! # -------------------------------------------------------------------------- # PMIx_Init failed for the following reason: # # PMIX_ERROR # # Open MPI requires access to a local PMIx server to execute. Please ensure # that either you are operating in a PMIx-enabled environment, or use "mpirun" # to execute the job. 
# -------------------------------------------------------------------------- # ok nep_tutorials-ex42_1 # SKIP Command failed so no diff TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_2.counts not ok nep_tutorials_nlevp-loaded_string_1 # Error code: 14 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476 # [sbuild:43421] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721 # -------------------------------------------------------------------------- # The gds/shmem2 component attempted to attach to a shared-memory segment at a # particular base address, but was given a different one. Your job will now likely # abort. 
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa1412000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43425] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43425] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_2_mbe.counts
not ok nep_tutorials_nlevp-loaded_string_2+nep_refine_scheme-schur # Error code: 14
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43471] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3fb2ff6000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43482] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43482] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_2 # SKIP Command failed so no diff
not ok nep_tutorials_nlevp-loaded_string_2_mbe # Error code: 14
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43481] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f8f183000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43485] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43485] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_2_mbe # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_3_explicit.counts
not ok nep_tutorials_nlevp-loaded_string_2+nep_refine_scheme-explicit # Error code: 14
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43501] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb1d05000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43519] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43519] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_3_mbe.counts
not ok nep_tutorials_nlevp-loaded_string_3_explicit # Error code: 14
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43529] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f95bf2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43532] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:43533] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43533] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43532] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-43529@1,0]
# Exit code: 14
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_3_explicit # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_4.counts
not ok nep_tutorials_nlevp-loaded_string_4 # Error code: 14
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9ce01000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43596] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43596] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43593] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:43597] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:43593] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# [sbuild:43593] PMIX ERROR: PMIX_ERR_UNREACH in file ../../../../../src/mca/ptl/base/ptl_base_connection_hdlr.c at line 120
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43597] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_4 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_5.counts
not ok nep_tutorials_nlevp-loaded_string_3_mbe # Error code: 14
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fb091d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43583] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43564] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# [sbuild:43580] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43583] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# prterun detected that one or more processes exited with non-zero status,
# thus causing the job to be terminated. The first process to do so was:
#
# Process name: [prterun-sbuild-43564@1,1]
# Exit code: 14
# --------------------------------------------------------------------------
ok nep_tutorials_nlevp-loaded_string_3_mbe # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_6.counts
not ok nep_tutorials_nlevp-loaded_string_5 # Error code: 14
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43636] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbe08e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43639] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43639] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_5 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_7.counts
not ok nep_tutorials_nlevp-loaded_string_6 # Error code: 14
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43683] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f9dd7b000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43694] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43694] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_6 # SKIP Command failed so no diff
not ok nep_tutorials_nlevp-loaded_string_7 # Error code: 14
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43693] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fa8ab2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43697] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43697] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_7 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8.counts
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_rii_thres.counts
not ok nep_tutorials_nlevp-loaded_string_8+nep_type-rii # Error code: 14
not ok nep_tutorials_nlevp-loaded_string_8_rii_thres # Error code: 14
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43750] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fa87ae000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43757] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43757] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43751] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3fa58b5000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43756] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43756] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff
ok nep_tutorials_nlevp-loaded_string_8_rii_thres # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_slp_thres.counts
not ok nep_tutorials_nlevp-loaded_string_8+nep_type-slp # Error code: 14
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43784] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f9384d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43799] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43799] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff
not ok nep_tutorials_nlevp-loaded_string_8_slp_thres # Error code: 14
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43801] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3f8c934000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43804] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43804] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_8_slp_thres # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/nep_tutorials_nlevp-loaded_string_8_slp_two_thres.counts
not ok nep_tutorials_nlevp-loaded_string_8+nep_type-narnoldi # Error code: 14
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43820] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f9230e000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43846] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43846] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok nep_tutorials_nlevp-loaded_string_8 # SKIP Command failed so no diff
RM test-rm-nep.F90
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test1_1.counts
not ok nep_tutorials_nlevp-loaded_string_8_slp_two_thres # Error code: 14
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43848] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3fbb896000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43851] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43851] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok nep_tutorials_nlevp-loaded_string_8_slp_two_thres # SKIP Command failed so no diff
ok mfn_tests-test1_1 # SKIP Requires DATAFILESPATH
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test2_1.counts
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test2_3.counts
not ok mfn_tests-test2_3 # Error code: 14
not ok mfn_tests-test2_1+mfn_type-krylov # Error code: 14
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43919] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaabc000000
# Acquired Address: 0x3f84bae000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43925] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43925] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test2_3 # SKIP Command failed so no diff
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43916] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac8000000
# Acquired Address: 0x3f83ac7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43923] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43923] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok mfn_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_1.counts
not ok mfn_tests-test2_1+mfn_type-expokit # Error code: 14
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43954] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3f82bf7000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43968] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43968] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_1_x.counts
not ok mfn_tests-test3_1 # Error code: 14
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:43969] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3f91add000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:43972] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:43972] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok mfn_tests-test3_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test3_2.counts
not ok mfn_tests-test3_1_x # Error code: 14
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44006] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9622d000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44024] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44024] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test3_1_x # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test4_1.counts
not ok mfn_tests-test3_2 # Error code: 14
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44029] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fad5fc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44032] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44032] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test3_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tests-test5_1.counts
not ok mfn_tests-test4_1 # Error code: 14
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44065] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3faa8d3000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44083] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44083] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test4_1 # SKIP Command failed so no diff
not ok mfn_tests-test5_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex23_1.counts
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44089] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac4000000
# Acquired Address: 0x3fb1725000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44092] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44092] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tests-test5_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex26_1.counts
not ok mfn_tutorials-ex23_1 # Error code: 14
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44139] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f906a0000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44149] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44149] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tutorials-ex23_1 # SKIP Command failed so no diff
not ok mfn_tutorials-ex26_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex37_1.counts
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44146] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab4000000
# Acquired Address: 0x3fbb612000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44152] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44152] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
#
ok mfn_tutorials-ex26_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex39_1.counts
not ok mfn_tutorials-ex37_1 # Error code: 14
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44199] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3fbb9e2000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44209] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44209] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok mfn_tutorials-ex37_1 # SKIP Command failed so no diff
not ok mfn_tutorials-ex39_1 # Error code: 14
TEST installed-arch-linux2-c-opt/tests/counts/mfn_tutorials-ex39_2.counts
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44206] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f9bbfd000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44212] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44212] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok mfn_tutorials-ex39_1 # SKIP Command failed so no diff
RM test-rm-mfn.F90
TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_1.counts
not ok mfn_tutorials-ex39_2 # Error code: 14
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44260] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaac0000000
# Acquired Address: 0x3f8fe15000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44270] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44270] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok mfn_tutorials-ex39_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_2.counts
not ok lme_tests-test1_1 # Error code: 14
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44267] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3f85967000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44273] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44273] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok lme_tests-test1_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test1_3.counts
not ok lme_tests-test1_2 # Error code: 14
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44319] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaacc000000
# Acquired Address: 0x3fbb757000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44330] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44330] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok lme_tests-test1_2 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tests-test2_1.counts
not ok lme_tests-test1_3 # Error code: 14
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44328] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaaac000000
# Acquired Address: 0x3f868fc000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44333] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44333] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok lme_tests-test1_3 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tutorials-ex32_1.counts
not ok lme_tests-test2_1 # Error code: 14
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44380] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3fb7344000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44390] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44390] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok lme_tests-test2_1 # SKIP Command failed so no diff
TEST installed-arch-linux2-c-opt/tests/counts/lme_tutorials-ex32_2.counts
not ok lme_tutorials-ex32_1 # Error code: 14
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44387] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f9a593000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44393] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44393] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok lme_tutorials-ex32_1 # SKIP Command failed so no diff
RM test-rm-eps.c
TEST installed-arch-linux2-c-opt/tests/counts/svd_tutorials_cnetwork-embedgsvd_1.counts
not ok lme_tutorials-ex32_2 # Error code: 14
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44440] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab8000000
# Acquired Address: 0x3f981a8000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44449] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44449] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
# --------------------------------------------------------------------------
# PMIx_Init failed for the following reason:
#
# PMIX_ERROR
#
# Open MPI requires access to a local PMIx server to execute. Please ensure
# that either you are operating in a PMIx-enabled environment, or use "mpirun"
# to execute the job.
# --------------------------------------------------------------------------
ok lme_tutorials-ex32_2 # SKIP Command failed so no diff
RM test-rm-pep.c
RM test-rm-nep.c
RM test-rm-mfn.c
not ok svd_tutorials_cnetwork-embedgsvd_1 # Error code: 14
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1056
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1231
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 1353
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2405
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2460
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../../../../src/mca/gds/shmem2/gds_shmem2.c at line 2476
# [sbuild:44451] PMIX ERROR: PMIX_ERROR in file ../../../src/server/pmix_server.c at line 4721
# --------------------------------------------------------------------------
# The gds/shmem2 component attempted to attach to a shared-memory segment at a
# particular base address, but was given a different one. Your job will now likely
# abort.
#
# Requested Address: 0x2aaab0000000
# Acquired Address: 0x3fb6703000
#
# If this problem persists, please consider disabling the gds/shmem2 component by
# setting in your environment the following: PMIX_MCA_gds=hash
# --------------------------------------------------------------------------
# [sbuild:44454] PMIX ERROR: PMIX_ERR_OUT_OF_RESOURCE in file ../../../src/client/pmix_client.c at line 278
# *** An error occurred in MPI_Init_thread
# *** on a NULL communicator
# *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
# *** and MPI will try to terminate your MPI job as well)
# [sbuild:44454] Local abort before MPI_INIT completed completed successfully, but am not able to aggregate error messages, and not able to guarantee that all other processes were killed!
ok svd_tutorials_cnetwork-embedgsvd_1 # SKIP Command failed so no diff
RM test-rm-lme.c
RM test-rm-svd.c
E: Build killed with signal TERM after 150 minutes of inactivity
--------------------------------------------------------------------------------
Build finished at 2026-01-05T13:33:47Z
Finished
--------
+------------------------------------------------------------------------------+
| Cleanup                                       Mon, 05 Jan 2026 13:33:47 +0000 |
+------------------------------------------------------------------------------+
Purging /build/reproducible-path
Not cleaning session: cloned chroot in use
E: Build failure (dpkg-buildpackage died with exit 143)
+------------------------------------------------------------------------------+
| Summary                                       Mon, 05 Jan 2026 13:33:54 +0000 |
+------------------------------------------------------------------------------+
Build Architecture: riscv64
Build Type: any
Build-Space: 873096
Build-Time: 10765
Distribution: unstable
Fail-Stage: build
Host Architecture: riscv64
Install-Time: 9
Job: /srv/rebuilderd/tmp/rebuilderd1Mmvtg/inputs/slepc_3.24.1+dfsg1-1.dsc
Machine Architecture: riscv64
Package: slepc
Package-Time: 10817
Source-Version: 3.24.1+dfsg1-1
Space: 873096
Status: attempted
Version: 3.24.1+dfsg1-1
--------------------------------------------------------------------------------
Finished at 2026-01-05T13:33:47Z
Build needed 03:00:17, 873096k disk space
E: Build failure (dpkg-buildpackage died with exit 143)
sbuild failed
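Editor's note: every failing test above hits the same gds/shmem2 shared-memory attach error, and the PMIx help text itself names the workaround. A minimal sketch of applying it before re-running the test suite (assuming the tests are re-run from the same shell; the `echo` only confirms the setting took effect):

```shell
# Force PMIx to use the hash gds component instead of shmem2, as the
# repeated error text suggests. This sidesteps the shared-memory attach
# at a fixed base address that keeps failing on this riscv64 builder.
export PMIX_MCA_gds=hash
echo "PMIX_MCA_gds=${PMIX_MCA_gds}"
```

Whether this avoids the subsequent 150-minute inactivity timeout has not been verified here; it is only the remediation the PMIx diagnostic itself proposes.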